- Search the web to ground AI on real world data (facts, docs, events)
- Get the markdown content of any web page
- Discover and filter relevant URLs within a website
Installation
You need an API key. Get one from https://www.olostep.com/dashboard/api-keys. You can either use our remote hosted URL or run the server locally.

Remote hosted URL
Running with npx
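A typical npx invocation could look like this (a sketch; the `olostep-mcp` package name and `OLOSTEP_API_KEY` variable are taken from the Windows troubleshooting note in the Cursor section below):

```bash
env OLOSTEP_API_KEY=your-api-key npx -y olostep-mcp
```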
Manual Installation
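If you prefer a global install, something along these lines should work (a sketch, assuming the server is published on npm as `olostep-mcp` and exposes a binary of the same name):

```bash
npm install -g olostep-mcp
OLOSTEP_API_KEY=your-api-key olostep-mcp
```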
Running on Claude Desktop
Add this to your claude_desktop_config.json:

Running on Claude Code
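For Claude Code, the terminal command could look like this (a sketch; `claude mcp add` is the Claude Code CLI subcommand for registering MCP servers, and the `olostep-mcp` package name is assumed from the Windows note in the Cursor section):

```bash
claude mcp add olostep -e OLOSTEP_API_KEY=your-api-key -- npx -y olostep-mcp
```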
Run this in your terminal:

Running on Cursor
Manual setup
To configure Olostep MCP in Cursor:
- Open Cursor Settings (click the settings icon in the top right of the IDE)
- Click on MCP
- Click on "+ Add new global MCP Server"
- Enter the following in the mcp.json file:
- Remember to replace YOUR_API_KEY with your actual API key, which you can get from the Olostep dashboard
- You can now close the settings tab. If needed, refresh the MCP server list to see the new tools
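The mcp.json entry from step 4 might look like this (a sketch; it assumes the `olostep-mcp` package and the `OLOSTEP_API_KEY` environment variable shown in the Windows note below):

```json
{
  "mcpServers": {
    "olostep": {
      "command": "npx",
      "args": ["-y", "olostep-mcp"],
      "env": {
        "OLOSTEP_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
```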
If you are using Windows and run into issues, try `cmd /c "set OLOSTEP_API_KEY=your-api-key && npx -y olostep-mcp"`
The next time you use Cursor, it will automatically use the Olostep MCP server when needed. You can also explicitly ask it to use the Olostep MCP server when you need up-to-date information from the web.
Running on Windsurf
Add this to your ./codeium/windsurf/model_config.json:

Available Tools
- scrape_website: Extract content from a single URL. Supports multiple formats and JavaScript rendering.
- search_web: Search the web for a given query and return structured results (non-AI, parser-based).
- answers: Search the web and return AI-powered answers in the JSON structure you want, with sources and citations.
- batch_scrape_urls: Scrape up to 10k URLs at the same time. Perfect for large-scale data extraction.
- get_batch_results: Retrieve the status and scraped content for a batch job. If completed, fetches items and retrieves content for each URL.
- create_crawl: Autonomously discover and scrape entire websites by following links from a start URL.
- create_map: Get all URLs on a website. Extract URLs for discovery and site analysis.
- get_webpage_content: Retrieve content of a webpage in markdown format.
- get_website_urls: Search and retrieve relevant URLs from a website.
- google_search: Retrieve structured data from Google search results.
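MCP clients invoke these tools with JSON arguments. As a purely illustrative sketch (the parameter names below are assumptions and are not documented on this page), a scrape_website call might carry:

```json
{
  "name": "scrape_website",
  "arguments": {
    "url": "https://example.com",
    "formats": ["markdown"]
  }
}
```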