API and Plugin Development


InterroBot offers powerful API and plugin development capabilities, allowing you to leverage InterroBot's best-in-class crawl filtering toward your own ends. This guide covers the basics of working with the InterroBot API and creating plugins from a variety of crawl data.

InterroBot API Overview

The Plugin API provides a custom web crawler and web scraping API for developers, enabling you to create custom reports, visualizations, or interactive apps from raw crawl data.

Technical Sandbox

If you're just getting started, InterroBot provides a technical API sandbox in-app. You can get there from the Options page by navigating to the "Crawler Data API."

The sandbox is the best place to suss out plugin capabilities and data access. It provides a programmable web crawler interface where you can experiment with different queries and see real-time results. What it exposes represents the complete data API, and serves as the canonical API definition.

The box above the JSON response contains the query generated from the form arguments; use it to reproduce a predictable result, or as a quick start from which to build your own queries.

Navigating to the API interface via Options, in-app.

API Usage

JavaScript API Access

JavaScript can access the data through the interrobot-plugin library. With the library, you can build web crawler plugins that extend InterroBot's functionality to suit your specific needs. Please refer to the repository for details on setting up a development environment. Windows users will need a special debug build to access devtools, while macOS users should have everything they need in the stock App Store build.
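
As a minimal sketch of what a plugin's data access looks like, the snippet below calls Plugin.postApiRequest, the method used throughout the examples on this page. The import path and the loadProjects wrapper are illustrative assumptions; see the repository for the actual module layout and plugin scaffolding.

// Illustrative import path; see the interrobot-plugin repository
// for the actual module layout and plugin scaffolding.
import { Plugin } from "./interrobot-plugin.js";

// Hypothetical wrapper: fetch all projects through the documented
// postApiRequest gateway.
async function loadProjects() {
    return await Plugin.postApiRequest("GetProjects", {
        fields: [],
        projects: [],
    });
}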

Core API Methods

Most of the heavy lifting comes in the form of three API methods: GetProjects, GetResources, and GetCrawls. These methods form the backbone of InterroBot's web crawler data extraction API, allowing you to retrieve and process crawled information efficiently. Between them, you can transform your website data into tables, visualizations, or interactive UIs.

Technical documentation for the plugin is hosted on GitHub.

Example API Requests

Retrieve a list of all projects, including icon image data.

const result = await Plugin.postApiRequest("GetProjects", {
    fields: ["image"],
    projects: []
});

Retrieve a list of pages with header data.

const result = await Plugin.postApiRequest("GetResources", {
    external: true,
    fields: ["headers"],
    offset: 0,
    query: "headers: image",
    project: <projectId>,
    type: "any",
});
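
Results are paged via the offset argument. As a rough sketch, the loop below advances offset until a request returns no further rows; the response shape (a results array) and the exact paging behavior are assumptions, easiest to confirm in the sandbox.

// Sketch: walk paged results by advancing offset. The response
// shape (page.results) and the paging behavior are assumptions;
// confirm both in the sandbox.
let offset = 0;
const allResources = [];
while (true) {
    const page = await Plugin.postApiRequest("GetResources", {
        external: true,
        fields: ["headers"],
        offset: offset,
        query: "headers: image",
        project: <projectId>,
        type: "any",
    });
    const batch = page.results ?? [];
    if (batch.length === 0) {
        break;
    }
    allResources.push(...batch);
    offset += batch.length;
}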

Retrieve a list of crawls by project.

const crawls = await Plugin.postApiRequest("GetCrawls", {
    complete: "any",
    fields: [],
    project: <projectId>,
});
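
The three methods compose naturally. As a sketch, the loop below enumerates crawls across every project; it assumes each project record exposes an id field and that responses carry a results array, both of which are worth verifying in the sandbox.

// Sketch: enumerate crawls across all projects. Assumes project
// records expose an id field and that responses carry a results
// array; verify both shapes in the sandbox.
const projects = await Plugin.postApiRequest("GetProjects", {
    fields: [],
    projects: [],
});
for (const project of projects.results ?? projects) {
    const projectCrawls = await Plugin.postApiRequest("GetCrawls", {
        complete: "any",
        fields: [],
        project: project.id,
    });
    console.log(project.id, projectCrawls);
}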

To get a better feel for what data and filters are available, see the sandbox. To get an idea of what is possible, check out the plugins directory, full of plugin examples, many of which are open source.

The API/Plugin Connection

In InterroBot's architecture, the API and plugin development are intrinsically linked. Every API interaction occurs within a plugin context, meaning you'll always be developing a plugin when using the API.

This design choice provides a consistent framework for extending InterroBot's functionality. While it may differ from traditional API usage, it allows for deeper integration with InterroBot's core features: you can create a specialized web crawler for content analysis, SEO audits, or any other use case you might have. As you work with the API, you'll simultaneously be building plugins that can leverage InterroBot's full range of capabilities, from data retrieval to complex analysis.

For scenarios requiring data access outside the plugin context, InterroBot offers alternative solutions:

  1. Most pages within InterroBot provide exportable data in standard formats.
  2. For comprehensive data access, you can export the entire SQLite database, which contains all crawl data and associated metadata.

These export options ensure flexibility in data utilization while maintaining the integrity of the plugin-based API architecture.
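
For a quick look at what an exported database contains, any SQLite client will do. The snippet below is a sketch using the better-sqlite3 Node package (an assumption, not something InterroBot ships) to list the tables in an export; the table names are best discovered this way rather than guessed, and the export filename shown is hypothetical.

// Sketch: inspect an exported InterroBot database with the
// better-sqlite3 Node package (not bundled with InterroBot).
// The export path shown is hypothetical.
import Database from "better-sqlite3";

const db = new Database("interrobot-export.sqlite", { readonly: true });
// sqlite_master works on any SQLite file, so no schema knowledge
// is needed to see which tables the export contains.
const tables = db.prepare(
    "SELECT name FROM sqlite_master WHERE type = 'table'"
).all();
console.log(tables.map((row) => row.name));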

Future Development

InterroBot plugins are a new feature (summer '24), and more technical details are forthcoming. At this stage, the interrobot-plugin GitHub repo may be more active than this webpage. Contact me through the app if you have any questions.

