
Collecting information via an RSS feed
An RSS (Really Simple Syndication) feed is an XML format for automatically distributing updates from websites such as blogs and news sites. Users subscribe via an RSS reader, which centralizes new content in one place so they no longer have to visit each site manually. It is an effective way to keep abreast of the latest publications.
Cikisi collects RSS feeds using a variety of techniques and makes their content available in your personal space.
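
To make the mechanism concrete, here is a minimal Python sketch that fetches and parses an RSS 2.0 feed using only the standard library. The feed URL is a placeholder, and this illustrates how RSS works in general, not how Cikisi implements it:

```python
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/feed.xml"  # placeholder: substitute a real feed URL

# Download and parse the XML document
with urllib.request.urlopen(FEED_URL) as response:
    tree = ET.parse(response)

# In RSS 2.0, each article is an <item> element under <channel>
for item in tree.getroot().findall("./channel/item"):
    title = item.findtext("title", default="(no title)")
    link = item.findtext("link", default="")
    published = item.findtext("pubDate", default="")
    print(f"{published} | {title} | {link}")
```

An RSS reader does essentially this on a schedule, remembering which items it has already seen so that only new entries surface.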

Collecting information via a scraping bot
A scraping bot is designed to extract specific data from websites. Unlike crawlers, which scan and index pages for search engines, scraping bots target and copy specific information, such as product prices, customer reviews or contact lists. This data is then used for analysis, competitor monitoring or database integration. Scraping can be controversial: it may violate a website's terms of use and raise issues of intellectual property and server load. It is therefore important to ensure compliance with each website's terms of use and with applicable regulations such as the GDPR.
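
As an illustration of how targeted scraping is, here is a minimal Python sketch that extracts product names and prices from a hypothetical catalogue page using the third-party requests and BeautifulSoup libraries. The URL and the CSS selectors are assumptions and would have to be adapted to the real page:

```python
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/products"        # hypothetical catalogue page
HEADERS = {"User-Agent": "example-bot/1.0"} # identify the bot honestly

response = requests.get(URL, headers=HEADERS, timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

# The .product / .name / .price selectors are assumed; inspect the
# actual page's HTML to find the right ones
for product in soup.select(".product"):
    name = product.select_one(".name")
    price = product.select_one(".price")
    if name and price:
        print(name.get_text(strip=True), "-", price.get_text(strip=True))
```

Note that the bot copies only the two fields it was built for and ignores everything else on the page, which is exactly what distinguishes it from a crawler.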

Gathering information with a website crawler
A website crawler is an automated program designed to browse web pages systematically. It extracts information from these pages for various uses, including indexing by search engines such as Google. Crawlers analyze the content, links and structure of websites to assess their relevance and quality, making it easier to find and rank pages in search engine results.
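
To show the difference in behaviour, here is a minimal Python sketch of a breadth-first crawler that stays on one domain, follows the links it discovers, and caps the number of pages it visits to limit server load. The start URL is a placeholder, and a production crawler would also honour robots.txt:

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://example.com/"  # placeholder start page
MAX_PAGES = 20                  # hard cap to keep server load low

domain = urlparse(START).netloc
seen, queue = {START}, deque([START])

while queue:
    url = queue.popleft()
    try:
        response = requests.get(url, timeout=10)
        response.raise_for_status()
    except requests.RequestException:
        continue  # skip unreachable or failing pages

    soup = BeautifulSoup(response.text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else "(no title)"
    print(url, "->", title)

    # Queue every same-domain link not seen before, up to the cap
    for anchor in soup.find_all("a", href=True):
        link = urljoin(url, anchor["href"]).split("#")[0]
        if urlparse(link).netloc == domain and link not in seen and len(seen) < MAX_PAGES:
            seen.add(link)
            queue.append(link)
```

Unlike the scraping bot above, the crawler does not look for any particular field; it maps the structure of the site by following links, which is the behaviour search engines rely on for indexing.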