We monitor all the layers of the internet
- The surface web (centralised): 2 to 20% of the web. Content indexed by search engines: public websites, social media (public data), databases (cloud, drives) connected to the internet, and connected objects (IoT).
- The deep web (centralised): 80 to 90% of the accessible web, poorly indexed or not indexed at all: forums and sites that do not want to be indexed.
- The dark web (decentralised): the non-accessible web. Around 350,000 URLs, of which some 30,000 are active.
GrayMatter, our key technology, allows us to:
- Collect: retrieve and update information.
- Index: categorise and consolidate data by pulling out the most useful information.
- Analyse: relationships, clusters of influence, social media (public data), and the digital footprint of a person, brand, product, or any other activity (document, transaction, IP address, etc.).
- Display: structure, display, and map out massive volumes of data in near real time to reveal information that is not initially visible (such as links between pages).
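The four stages above form a pipeline. The following is a minimal sketch of that collect, index, analyse, display flow; all function names and data shapes are illustrative assumptions, not GrayMatter's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Page:
    url: str
    text: str
    links: list = field(default_factory=list)

def collect(urls):
    # Stand-in for retrieval: a real collector would fetch and update pages.
    return [Page(url=u, text=f"content of {u}") for u in urls]

def index(pages):
    # Categorise pages by a simple key (here: the URL scheme).
    catalogue = {}
    for p in pages:
        catalogue.setdefault(p.url.split("://")[0], []).append(p)
    return catalogue

def analyse(catalogue):
    # Consolidate: count pages per category as a trivial metric.
    return {k: len(v) for k, v in catalogue.items()}

def display(report):
    # Structure the result for presentation.
    return "\n".join(f"{k}: {v} page(s)" for k, v in sorted(report.items()))

pages = collect(["https://example.org", "http://example.net"])
print(display(analyse(index(pages))))
```

Each stage consumes the previous stage's output, so the steps can be scaled or swapped independently.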
GrayMatter removes three technological barriers:
Massively parallel collection
Processing the massive amounts of data generated online requires specific parallel algorithms. GrayMatter’s capacity to update data surpasses all other solutions on the market. The core focus of Aleph’s R&D is addressing the relationship between the number of data sources and performance.
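To illustrate the idea of fanning many data sources out across parallel workers, here is a minimal sketch using a thread pool; `fetch_source` is a hypothetical stand-in for the real retrieval step, not GrayMatter's implementation.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_source(source_id):
    # A real crawler would issue a network request here.
    return {"source": source_id, "status": "ok"}

def collect_parallel(source_ids, max_workers=32):
    # Fan the sources out across workers; map() preserves input order.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch_source, source_ids))

results = collect_parallel(range(100))
print(len(results))  # 100
```

In a sketch like this, throughput grows with the worker count until the bottleneck shifts to the network or the sources themselves, which is why the relationship between source count and performance is a research question in its own right.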
Processing and merging disparate data
GrayMatter uses multi-source, multi-protocol technology and features a merging tool that is separate from data structures. It can manage and analyse disparate structures online (particularly on the Deep and Dark Webs). Unlike other tools on the market, its power and pertinence provide users (private companies, NGOs, government and intelligence departments, etc.) with sufficiently detailed information.
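One way to keep a merging step separate from the data structures it consumes is to normalise each source's records into a canonical shape before consolidation. The sketch below illustrates that pattern; the source names, field mappings, and record layouts are invented for the example.

```python
# Per-source mapping from raw field names to a canonical schema.
FIELD_MAP = {
    "forum": {"author": "actor", "body": "text"},
    "paste": {"user": "actor", "content": "text"},
}

def normalise(source, record):
    # Translate one raw record into the canonical shape.
    mapping = FIELD_MAP[source]
    return {canonical: record[raw] for raw, canonical in mapping.items()}

def merge(batches):
    # batches: iterable of (source_name, list_of_raw_records).
    merged = []
    for source, records in batches:
        merged.extend(normalise(source, r) for r in records)
    return merged

rows = merge([
    ("forum", [{"author": "a1", "body": "hello"}]),
    ("paste", [{"user": "a2", "content": "dump"}]),
])
```

Because the merge logic only ever sees the canonical schema, adding a new source means adding a mapping entry rather than changing the merging tool itself.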
Independence and discretion
GrayMatter is the centrepiece of the only search engine for the Deep and Dark Webs that is independent of any API. It offers detailed and in-depth analysis tools while allowing users to remain in control.