Zignal Labs uses leading technologies to gather and analyze massive amounts of data. Our platform is driven by an open, extensible stack built on some of the most innovative technology available today.
"Everybody here is great. Everybody here is smart. Everybody is motivated. They’re passionate. They like what they’re working on. They’re curious and engaged."
"In my four years at Zignal, I have seen both the company grow and have experienced professional growth. Zignal gave me the opportunity to change career paths from Engineering to Product Management."
"I love the variety of media data sources and types that I get to work with at Zignal Labs, especially when I'm able to tie together insights across sources in a scalable way for all of our clients."
"The product we built at Zignal Labs is something that has never been done before."
We build products on the cutting edge of technology to deliver narrative intelligence in real time. We invest time and resources in delivering beautiful displays you can't help but gawk at. And we have a lot of fun doing it.
What we practice
We work towards a unified goal of continuing to make our products exceptional and our customers happy.
We’re engineering with a driven, creative, entrepreneurial mindset
We prototype, test, adjust and deliver revolutionary solutions to our clients
We engineer solutions to some of today's most challenging, complex data-driven problems
We have fun and work as a team to constantly improve an already great product
What we use
We use some of the most innovative, extensible, open-source distributed technologies available today to power our high-volume, real-time big data analytics platform.
Scala is our primary back-end language, used to develop most of our core business-critical applications.
Spark is an amazing in-memory cluster-computing framework that lets us process data in near real time.
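As a toy illustration of the kind of computation Spark parallelizes, here is a word count written in plain Scala collections. The flatMap/group/count shape mirrors a typical Spark job; this sketch runs on a single machine and does not use the actual Spark API, and the example data is invented.

```scala
// Toy word count in plain Scala, mirroring the shape of a Spark job:
// flatMap -> filter -> group by key -> count. In Spark, the same chain
// would run in parallel across in-memory partitions on a cluster.
object WordCount {
  def count(docs: Seq[String]): Map[String, Int] =
    docs
      .flatMap(_.toLowerCase.split("\\s+")) // tokenize each document
      .filter(_.nonEmpty)                   // drop empty tokens
      .groupBy(identity)                    // analogous to a shuffle by key
      .map { case (word, occs) => word -> occs.size }

  def main(args: Array[String]): Unit = {
    val result = count(Seq("real time data", "real time analytics"))
    println(result("real")) // 2
  }
}
```

In Spark proper, `docs` would be an RDD or Dataset partitioned across the cluster, but each transformation keeps the same name and meaning.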
We use Node.js to interface with other back-end components and build application APIs. Node is lightweight and efficient, perfect for our data-intensive real-time applications.
Interactive data-visualization technologies like D3.js, Three.js, and Highcharts create some of the most appealing parts of our products.
Natural language processing (NLP) enhances the analytics in our dashboards, helping our clients understand and condense their data in an automated way.
We use React.js to keep our application performant, beautiful and responsive across all devices.
Elasticsearch is the data-serving layer for text-search queries that drives our front end.
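For a sense of what such a serving layer handles, a full-text Elasticsearch query combining a text match with a recency filter might look like the following. The index schema, field names, and search terms here are purely hypothetical, not Zignal's actual schema.

```json
{
  "query": {
    "bool": {
      "must":   { "match": { "text": "supply chain narrative" } },
      "filter": { "range": { "published_at": { "gte": "now-24h" } } }
    }
  }
}
```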
We use Docker to run and automate our underlying infrastructure.
We use Apache Storm for real-time data enrichment, analytics, and real-time data delivery to the front end.
We use Apache Kafka for real-time analysis and rendering of massive amounts of streaming data in different phases of our data-ingestion pipeline.
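The kind of aggregation a streaming pipeline like this performs can be sketched in a few lines: group timestamped events into fixed time windows and count occurrences per window. This is plain Scala for illustration only; the real Kafka/Storm consumers, topic names, and event schema are not shown here, and `Mention` is an invented type.

```scala
// Toy sketch of per-window aggregation in a streaming pipeline:
// assign each timestamped event to a fixed-size window, then count
// events per (window, term). Plain Scala, not the Kafka/Storm APIs.
object WindowedCounts {
  case class Mention(timestampMs: Long, term: String)

  // Count mentions of each term per fixed-size time window.
  def countByWindow(events: Seq[Mention], windowMs: Long): Map[(Long, String), Int] =
    events
      .groupBy(e => (e.timestampMs / windowMs, e.term)) // window assignment
      .map { case (key, es) => key -> es.size }

  def main(args: Array[String]): Unit = {
    val events = Seq(Mention(100, "ai"), Mention(900, "ai"), Mention(1200, "ai"))
    // First 1-second window sees "ai" twice; the second window sees it once.
    println(countByWindow(events, 1000))
  }
}
```

A real deployment would compute this incrementally as events arrive rather than over a materialized `Seq`, but the grouping logic is the same.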