At the core of our services are Stratalis Cloud™, our custom web scraping platform, and our team of data extraction programmers.
Our programmers build robots that are individually customised to crawl the websites you need data extracted from. They follow your project brief to ensure information is obtained from the correct sources. Once this stage is complete, the collected data is pooled together in an accessible format, such as Excel, CSV or a format customised to your specifications.
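For readers curious what such a robot looks like in miniature, here is an illustrative sketch in Python: it parses fields out of HTML and pools the results into CSV. The HTML structure, field names and sample data are hypothetical, not a real client feed.

```python
import csv
import io
from html.parser import HTMLParser

class ProductParser(HTMLParser):
    """Collects (name, price) pairs from hypothetical
    <span class="name"> / <span class="price"> markup."""
    def __init__(self):
        super().__init__()
        self.rows = []        # completed (name, price) pairs
        self._field = None    # which field the next text node belongs to
        self._current = {}

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "span" and cls in ("name", "price"):
            self._field = cls

    def handle_data(self, data):
        if self._field:
            self._current[self._field] = data.strip()
            self._field = None
            if len(self._current) == 2:
                self.rows.append((self._current["name"],
                                  self._current["price"]))
                self._current = {}

# Hypothetical page fragment standing in for a crawled site
sample = '<div><span class="name">Widget</span><span class="price">9.99</span></div>'
parser = ProductParser()
parser.feed(sample)

# Pool the extracted rows into CSV, one of the delivery formats mentioned above
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["name", "price"])
writer.writerows(parser.rows)
```

In a real project the parser would be tailored to each target site's markup, and the output written to a file or delivered in whatever format the brief specifies.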
The specifications of your project brief will be fine-tuned between you and our Project Managers to produce a precise and unambiguous brief to work from. This reduces the likelihood of errors and alleviates any stress or confusion. Should any issues arise during the project, we will seek clarification with you.
As soon as the robot is created we plug it into Stratalis Cloud™. The cloud enables your robots to access any website, at any time, and includes every capability needed to overcome the obstacles posed by complex or deliberately obstructive websites.
This service is available for both one-off and recurring projects. The majority of our clients, however, require their data feed refreshed every day, and in some cases multiple times per day.
The many possibilities offered by our web scraping robots include:
- Crawling HTML sites
- Navigating obscured and complex sites
- Submitting forms and performing multi-page actions
- Extracting data in real time
- Extracting text from most downloadable document formats (PDF, Word, Excel, etc.)
- Extracting text and numbers from images
- Bypassing most captchas and IP blocks
- Using large numbers of IP addresses for a single crawl
- Closely simulating real human interaction with a website
- Scraping several million pages
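To illustrate the form-submission and multi-page capability above, here is a minimal sketch using only Python's standard library. It builds (but does not send) a login POST request and enumerates the paginated URLs a robot would visit in order; the site URL, form fields and page count are hypothetical placeholders, not our production tooling.

```python
import urllib.parse
import urllib.request

BASE = "https://example.com"  # placeholder target site

def build_login_request(username, password):
    """Build (without sending) a POST request that submits a login form."""
    form = urllib.parse.urlencode({"user": username, "pass": password}).encode()
    return urllib.request.Request(f"{BASE}/login", data=form, method="POST")

def listing_urls(pages):
    """Enumerate the paginated listing URLs a robot would crawl in sequence."""
    return [f"{BASE}/catalogue?page={n}" for n in range(1, pages + 1)]

req = build_login_request("demo", "secret")
urls = listing_urls(3)
```

A production robot layers session handling, retries, proxy rotation and human-like pacing on top of this basic pattern.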
Get in touch with Stratalis to discuss your project and how we can help you.