Available Connectors
The user-friendly Condense App provides all the tools you need to build and launch your solution quickly. Condense offers everything from Telematics Devices that bring in real-world data, to Stream and Store Connectors that manage information flow, to Databases that store your solution's data. You can even connect it to a user-facing app or a Power BI dashboard, or link it to your own external database. With Condense, launching your solution is faster and easier than ever!
Connector Type: Input
Connector Category: Telematics Device
Description: Zeliot’s Condense Edge is a modular, low-memory-footprint embedded firmware that enables the collection and transfer of rich datasets generated by vehicles, as well as OTA updates of vehicle ECUs. This data can then be used to derive actionable insights that improve vehicle performance, and to build failsafe mechanisms that cut costs and keep vehicle uptime high.
Configurable Parameters: Telematics Device Type, Title, Input Topic, Output Topic
Know more about Condense Edge here: https://www.zeliot.in/our-products/condense-edge
Connector Type: Input
Connector Category: Telematics Device
Description: Headquartered in Bangalore, iTriangle Infotech is a leading force in the Vehicle Telematics industry and is renowned as India's largest manufacturer of Vehicle Telematics Solutions. iTriangle specializes in providing comprehensive solutions for Transport, Logistics, Education, OEMs, Vehicle Manufacturers, EVs, Healthcare, and the Automobile Industry. Its proprietary Remote Data Acquisition Platform (GSM-GPS-RFID-Sensors) supports Vehicle Tracking, Fleet Management, Personal Tracking, Safety, Diagnostics, and Prognostics.
Integrated Devices: BHARAT 101 Plus 4G - Bramha, BHARAT 101 Plus 4G - Bramha+, BHARAT 101 Plus 4G - Sarva, TS101 Plus 4G - Yantra, TS 101 Basic - EV, BHARAT 101 - IRNSS, TS 101 Advance, OBD II
Configurable Parameters: Telematics Device Type, Title, Input Topic, Output Topic
Know more about iTriangle Devices here: Coming Soon 😄
Connector Type: Input
Connector Category: Telematics Device
Description: Merging the best of telecommunications and informatics, the Teltonika telematics team offers unique ways to monitor vehicles, assets, and the workforce. From motorbikes to heavy trucks, from a single tool to warehouses – there is a solution for every client. Telematics is the key to effective resource management, optimization of costs and safety across a range of industries.
Integrated Devices: FMB130, FMB920, FMB001, FMB000, FMB010, FMB020, FMB110, FMB120, FMB122, FMB125, FMB140, FMB150, FMB202, FMB204, FMB209, FMB225, FMB230, FMB240, FMB641, FMB900, FMB910, FMB930, FMB965, FMC130, FMC234, FMC150, FMC650, FMC920, FMC003, FMC125, FMC13A, FMC225, FMC230, FMC800, FMC880, TFT100
Configurable Parameters: Telematics Device Type, Title, Input Topic, Output Topic
Know more about Teltonika Devices here: Coming Soon 😄
Connector Type: Output
Connector Category: Stream Connector
Description: The HTTPS connector facilitates integration between Condense and external systems over HTTPS (HyperText Transfer Protocol Secure), securely transmitting data from Condense to external systems, or from external systems to Condense, using HTTPS POST/GET requests.
Configurable Parameters: HTTPS URL, Input Topic, Kafka Consumer Group
HTTPS URL Description: The HTTPS URL where data is sent. Typically used for secure communication over the web. How to obtain: a) Determine the server endpoint where your application needs to send the data.
b) Ensure the URL starts with https:// to use secure communication.
c) Obtain the HTTPS URL from the documentation of the web service or API endpoint being used.
Kafka Topic Description: This key defines the name of the Kafka topic from which the connector will read data. It also serves as the reference point for recording the progress of the consumer within the topic (known as offsets). Offsets track which messages have already been processed, ensuring the connector doesn't re-consume them.
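For illustration, here is a minimal Python sketch of the pattern this connector implements: read records from the configured Kafka input topic and forward each one to the HTTPS URL with a POST request. The broker address, topic name, consumer group, and URL below are placeholders, not Condense defaults.

```python
import requests
from kafka import KafkaConsumer  # pip install kafka-python requests

consumer = KafkaConsumer(
    "vehicle-data",                      # Input Topic (placeholder)
    bootstrap_servers="localhost:9092",  # assumed local broker
    group_id="https-connector",          # Kafka Consumer Group (placeholder)
)

for message in consumer:
    # Forward the raw record body; the consumer group commits offsets
    # automatically, so processed messages are not re-consumed.
    response = requests.post(
        "https://example.com/ingest",    # HTTPS URL (placeholder)
        data=message.value,
        headers={"Content-Type": "application/json"},
        timeout=10,
    )
    response.raise_for_status()
```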
Connector Type: Output
Connector Category: Stream Connector
Amazon Official Docs:
Kinesis Data Streams: https://aws.amazon.com/kinesis/data-streams/
Developer Guide: https://docs.aws.amazon.com/streams/latest/dev/introduction.html
Description: Amazon Kinesis Data Streams is a fully managed service for real-time data streaming at scale. It collects and processes large streams of data records in real time, letting developers build applications that continuously ingest and process streaming data.
Configurable Parameters: Title, Stream Name, Region, Access Key, Secret Key, Input Topic, Kafka Consumer Group
Kinesis Access Key ID Description: The access key ID is required to authenticate and access AWS Kinesis. How to obtain: a) Go to the AWS Management Console.
b) Navigate to IAM (Identity and Access Management).
c) Select Users and then your user.
d) Choose the Security credentials tab.
e) Create a new access key or use an existing one.
Kinesis Secret Key Description: The secret access key is required to authenticate and access AWS Kinesis. How to obtain: a) Go to the AWS Management Console.
b) Navigate to IAM (Identity and Access Management).
c) Select Users and then your user.
d) Choose the Security credentials tab.
e) Create a new access key or use an existing one.
Kinesis Region Name Description: The AWS region where the Kinesis stream is located. How to obtain: a) Go to the AWS Management Console.
b) Navigate to Kinesis.
c) Check the region setting in the top-right corner.
Kinesis Stream Name Description: The name of the Kinesis stream to which data will be sent. How to obtain: a) Go to the AWS Management Console.
b) Navigate to Kinesis.
c) List and select the stream name from the dashboard.
Kafka Topic Name Description: This key defines the name of the Kafka topic from which the connector will read data. It also serves as the reference point for recording the progress of the consumer within the topic (known as offsets). Offsets track which messages have already been processed, ensuring the connector doesn't re-consume them.
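As a quick way to sanity-check the access key, secret key, region, and stream name gathered above, here is a minimal boto3 sketch that writes a single record to the stream. All identifiers and the sample payload are placeholders.

```python
import json
import boto3  # pip install boto3

kinesis = boto3.client(
    "kinesis",
    region_name="us-east-1",                 # Region (placeholder)
    aws_access_key_id="YOUR_ACCESS_KEY_ID",  # Access Key (placeholder)
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
)

record = {"device_id": "demo-001", "speed_kmph": 42}
kinesis.put_record(
    StreamName="my-stream",                  # Stream Name (placeholder)
    Data=json.dumps(record).encode("utf-8"),
    PartitionKey=record["device_id"],  # same key -> same shard, preserving order
)
```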
Connector Type: Output
Connector Category: Stream Connector
Amazon Official Docs: Coming Soon
Description: Amazon SQS, short for Simple Queue Service, is a messaging service offered by Amazon Web Services (AWS). It acts as a temporary holding spot for messages between different software components. You can think of it like a waiting line where messages are delivered and then retrieved by another program for processing.
Configurable Parameters: Title, Queue Name or ARN, Region, Access Key, Secret Key, Input Topic
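To verify the queue settings, a short boto3 sketch like the one below sends a test message. The queue name, region, and credentials are placeholders; if you configured an ARN instead of a queue name, the URL lookup step differs.

```python
import json
import boto3  # pip install boto3

sqs = boto3.client(
    "sqs",
    region_name="us-east-1",                 # Region (placeholder)
    aws_access_key_id="YOUR_ACCESS_KEY_ID",
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
)

# Resolve the queue URL from its name, then enqueue one JSON message.
queue_url = sqs.get_queue_url(QueueName="my-queue")["QueueUrl"]
sqs.send_message(QueueUrl=queue_url, MessageBody=json.dumps({"event": "demo"}))
```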
Connector Type: Output
Connector Category: Stream Connector
Google Official Docs: Pub/Sub Documentation: https://cloud.google.com/pubsub/docs
Description: Google Pub/Sub is a messaging service on the Google Cloud Platform that lets you send and receive messages between applications. Unlike Amazon SQS, Pub/Sub focuses on asynchronous messaging, meaning applications don't have to wait for a response after sending a message.
Configurable Parameters: Title, PubSub Topic Name, Project ID, Service Account Key, Input Topic, Kafka Consumer Group
Kafka Topic Name Description: This key defines the name of the Kafka topic from which the connector will read data. It also serves as the reference point for recording the progress of the consumer within the topic (known as offsets). Offsets track which messages have already been processed, ensuring the connector doesn't re-consume them.
Pub/Sub Project ID Description: Available in the Google Cloud Console under your project’s details.
Pub/Sub Topic Name Description: Obtain the topic name by creating a topic in Google Cloud Pub/Sub or selecting an existing one. (For the input connector, a subscription name is used instead.)
Pub/Sub Service Account Credentials Description: Create or obtain service account credentials from the Google Cloud Console under IAM & Admin > Service Accounts. Download the JSON key file for the service account.
Kafka Error Topic Description: Configured based on the Kafka topic designated for error handling in your system.
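Here is a minimal publishing sketch with the google-cloud-pubsub client, useful for confirming the project ID, topic name, and service account key. It assumes the GOOGLE_APPLICATION_CREDENTIALS environment variable points at the JSON key file downloaded above; the project and topic IDs are placeholders.

```python
import json
from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project-id", "my-topic")  # placeholders

# publish() is asynchronous and returns a future; result() blocks until
# Pub/Sub acknowledges the message and returns its message ID.
future = publisher.publish(topic_path, json.dumps({"event": "demo"}).encode("utf-8"))
print(future.result())
```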
Connector Type: Output
Connector Category: Store Connector
MySQL Official Docs:
MySQL Documentation: https://dev.mysql.com/doc/
Downloads: https://dev.mysql.com/downloads/
Description: MySQL is an open-source relational database management system (RDBMS). Think of it as a digital filing cabinet that stores information in a structured way.
Configurable Parameters: Title, MySQL Host, MySQL Database Name, MySQL Port, MySQL Username, MySQL Password, MySQL Table Name, MySQL Fields, Input Topic, Kafka Consumer Group
MySQL Host Description: The server address where the MySQL database is hosted. How to obtain: a) Check the configuration files of your application.
b) Ask your system administrator.
c) Review hosting provider documentation.
MySQL Database Name Description: The specific database within the MySQL server you are connecting to. How to obtain: a) Query the MySQL server: SHOW DATABASES;
b) Check the configuration files of your application.
c) Ask your database administrator.
MySQL Port Description: The network port MySQL server is listening on, typically 3306. How to obtain: a) Check the configuration files of your MySQL server.
b) Use the command: SHOW VARIABLES LIKE 'port';
c) Ask your system administrator.
MySQL User Description: The username used to authenticate and connect to the MySQL server. How to obtain: a) Check the configuration files of your application.
b) Query the MySQL server: SELECT user FROM mysql.user;
c) Ask your database administrator.
MySQL Password Description: The password associated with the MySQL user account. How to obtain: a) Check the configuration files of your application.
b) Ask your database administrator. Note: For security reasons, passwords are usually not stored in plaintext and should be handled securely.
MySQL Table Name Description: The specific table within the MySQL database you are interacting with. How to obtain: a) Query the MySQL database: SHOW TABLES;
b) Check the configuration files of your application.
c) Ask your database administrator.
MySQL Fields Description: The columns or fields within the specified MySQL table. How to obtain: a) Query the MySQL table: DESCRIBE table_name;
b) Check the documentation or schema design of your database.
c) Ask your database administrator.
Kafka Topic Name Description: This key defines the name of the Kafka topic from which the connector will read data. It also serves as the reference point for recording the progress of the consumer within the topic (known as offsets). Offsets track which messages have already been processed, ensuring the connector doesn't re-consume them.
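Once you have gathered the values above, a short sketch like this can confirm them outside Condense by inserting one row. The host, port, database, user, password, table, and fields are all placeholders for your own values.

```python
import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(
    host="localhost",      # MySQL Host (placeholder)
    port=3306,             # MySQL Port
    database="telemetry",  # MySQL Database Name (placeholder)
    user="condense",       # MySQL User (placeholder)
    password="secret",     # MySQL Password (placeholder)
)
cursor = conn.cursor()

# Parameterized insert into the configured table and fields; the %s
# placeholders let the driver handle quoting and escaping.
cursor.execute(
    "INSERT INTO vehicle_data (device_id, speed_kmph) VALUES (%s, %s)",
    ("demo-001", 42),
)
conn.commit()
cursor.close()
conn.close()
```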
Connector Type: Output
Connector Category: Store Connector
InfluxDB Official Docs: Coming Soon
Description: InfluxDB is an open-source time series database specifically designed to handle data that changes over time.
Configurable Parameters: InfluxDB URL, InfluxDB Organization Name, InfluxDB Authentication Token, InfluxDB Bucket Name, Input Topic, Kafka Consumer Group
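Here is a minimal write sketch with the official influxdb-client package (InfluxDB 2.x API), handy for checking the URL, organization, token, and bucket. All four values below are placeholders.

```python
from influxdb_client import InfluxDBClient, Point  # pip install influxdb-client
from influxdb_client.client.write_api import SYNCHRONOUS

client = InfluxDBClient(
    url="http://localhost:8086",  # InfluxDB URL (placeholder)
    token="YOUR_TOKEN",           # Authentication Token (placeholder)
    org="my-org",                 # Organization Name (placeholder)
)
write_api = client.write_api(write_options=SYNCHRONOUS)

# One time-series point: a measurement, a tag for the series key, one field.
point = Point("vehicle_data").tag("device_id", "demo-001").field("speed_kmph", 42.0)
write_api.write(bucket="telemetry", record=point)  # Bucket Name (placeholder)
client.close()
```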
Connector Type: Output
Connector Category: Store Connector
Google Official Docs:
Bigtable Documentation: https://cloud.google.com/bigtable/docs
API Docs: https://cloud.google.com/bigtable/docs/apis
Developer Docs: https://cloud.google.com/bigtable/docs/overview
Description: Google Cloud Bigtable is a fully managed, scalable NoSQL database service designed for large analytical and operational workloads. It is ideal for applications that require high read and write throughput and low latency.
Configurable Parameters: Title, Bigtable Project ID, Bigtable Instance ID, Bigtable Table ID, Service Account Credentials, Input Topic, Kafka Consumer Group
The Bigtable Connector facilitates the integration between Condense and Bigtable. This connector allows for efficient data ingestion from Kafka topics directly into a Bigtable instance, enabling real-time analytics and storage of streaming data.
Bigtable Project ID Description: The ID of the Google Cloud project that contains your Bigtable instance. How to obtain: a) Navigate to the Google Cloud Console.
b) Select your project from the project selector drop-down.
c) The Project ID is displayed in the project info panel.
Bigtable Instance ID Description: The ID of the Bigtable instance within the project. How to obtain: a) In the Google Cloud Console, go to the Bigtable Instances page.
b) Select your Bigtable instance to view its details.
c) The Instance ID will be visible in the instance summary.
Bigtable Table ID Description: The ID of the table within the Bigtable instance. How to obtain: a) Go to the Bigtable section in the Google Cloud Console.
b) Open your instance and navigate to the Tables section.
c) Select the table you want to use; the Table ID will be listed.
Service Account Credentials Description: The Bigtable service account credential in base64 format. How to obtain: a) Create a service account for Bigtable with appropriate roles (Bigtable Admin, Bigtable User).
b) Download the JSON key file for the service account.
c) Encode the contents of the JSON key file in base64 (for example, with the base64 command-line utility).
Kafka Topic Description: The Kafka topic from which the data will be read to be written to the specified Bigtable instance.
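For reference, here is a minimal write sketch with the google-cloud-bigtable client using the IDs and credentials described above. The project, instance, and table IDs are placeholders, and 'cf1' is an assumed column family that must already exist on the table.

```python
from google.cloud import bigtable  # pip install google-cloud-bigtable

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at the service account key.
client = bigtable.Client(project="my-project-id", admin=False)
instance = client.instance("my-instance-id")
table = instance.table("my-table-id")

# Write one row keyed by device ID; Bigtable stores cell values as bytes/strings.
row = table.direct_row("device#demo-001")
row.set_cell("cf1", "speed_kmph", "42")
row.commit()
```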
Connector Type: Output
Connector Category: Store Connector
BigQuery Official Docs: Coming Soon
Description: BigQuery is a powerful data warehouse offered by Google Cloud Platform, designed for analyzing massive datasets.
Configurable Parameters: Title, BigQuery Table ID, BigQuery Project ID, Service Account Key, BigQuery Dataset ID, Input Topic, Kafka Consumer Group
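A short streaming-insert sketch with the google-cloud-bigquery client can validate the project, dataset, and table IDs. It assumes GOOGLE_APPLICATION_CREDENTIALS points at the service account key; all IDs below are placeholders.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="my-project-id")  # Project ID (placeholder)
table_id = "my-project-id.my_dataset.my_table"     # Dataset/Table IDs (placeholders)

rows = [{"device_id": "demo-001", "speed_kmph": 42}]
errors = client.insert_rows_json(table_id, rows)  # streaming insert
if errors:
    raise RuntimeError(f"BigQuery insert failed: {errors}")
```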
Connector Type: Output
Connector Category: Store Connector
Cassandra Official Docs: Coming Soon
Description: Cassandra is an open-source, distributed NoSQL database designed for handling massive amounts of data across many servers. It excels at high availability, meaning it's always accessible and scales easily to handle growing data demands. Unlike traditional databases, Cassandra stores data in a flexible format ideal for large, constantly changing datasets.
Configurable Parameters: More Details Coming Soon
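While the detailed parameters are still to come, a generic write sketch with the DataStax Python driver shows the kind of insert such a connector performs. The contact point, keyspace, and table are hypothetical and must already exist.

```python
from cassandra.cluster import Cluster  # pip install cassandra-driver

cluster = Cluster(["127.0.0.1"])        # contact point (placeholder)
session = cluster.connect("telemetry")  # keyspace (placeholder)

# Simple parameterized insert; %s placeholders are bound by the driver.
session.execute(
    "INSERT INTO vehicle_data (device_id, speed_kmph) VALUES (%s, %s)",
    ("demo-001", 42),
)
cluster.shutdown()
```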
Connector Type: Output
Connector Category: Stream Connector
Event Hub Official Docs: Coming Soon
Description: Event Hub is a real-time data ingestion service from Microsoft Azure. It's essentially a high-speed data pipeline that can handle millions of events per second from various sources. Think of it as a central hub where devices and applications send their data streams. Event Hub buffers this data for later processing and scales up or down based on your needs. It integrates with other Azure services for further analysis and real-time decision-making.
Configurable Parameters: More Details Coming Soon
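Pending the full parameter list, here is a generic sending sketch with the azure-eventhub package showing how an application pushes events into a hub. The connection string and hub name are placeholders from your Azure portal.

```python
from azure.eventhub import EventHubProducerClient, EventData  # pip install azure-eventhub

producer = EventHubProducerClient.from_connection_string(
    "Endpoint=sb://<namespace>.servicebus.windows.net/;"
    "SharedAccessKeyName=<key-name>;SharedAccessKey=<key>",  # placeholder
    eventhub_name="my-event-hub",                            # placeholder
)

with producer:
    batch = producer.create_batch()            # batches respect the hub's size limit
    batch.add(EventData('{"event": "demo"}'))
    producer.send_batch(batch)
```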
Connector Type: Output
Connector Category: Store Connector
MSSQL Official Docs: Coming Soon
Description: MSSQL, short for Microsoft SQL Server, is a powerful relational database management system (RDBMS) developed by Microsoft. Think of MSSQL as a secure and organized filing cabinet for your data, ensuring its accuracy and easy access for various applications.
Configurable Parameters: More Details Coming Soon
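Until the parameters are published, a generic pyodbc sketch illustrates a typical insert into SQL Server. The connection details and table are placeholders, and the ODBC driver name depends on what is installed on your machine.

```python
import pyodbc  # pip install pyodbc; also requires a SQL Server ODBC driver

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=localhost,1433;DATABASE=telemetry;"            # placeholders
    "UID=condense;PWD=secret;TrustServerCertificate=yes;"
)
cursor = conn.cursor()

# Parameterized insert; ? placeholders are bound by the driver.
cursor.execute(
    "INSERT INTO vehicle_data (device_id, speed_kmph) VALUES (?, ?)",
    "demo-001", 42,
)
conn.commit()
conn.close()
```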