Deductive Inc
Deductive is a global consulting firm providing data consulting and engineering services to companies that want to build and implement strategies to put data to work. We work with primary data generators, businesses harvesting their own internal data, data-centric service providers, data brokers, agencies, media buyers and media sellers.
- (415) 843-1774
- 145 Marina Boulevard
San Rafael, CA 94901
United States
- Data Engineering
Our team has deep experience in developing and optimizing the critical parts of a media company's data infrastructure, including onboarding, pipelines, warehousing and reporting.
- Data Matching Services
Most datasets have limited value in isolation but significant benefits when combined with other datasets. We have worked extensively with match partners such as LiveRamp, Experian, and 4info to connect our customers' datasets and also develop technology for clients to directly match datasets to their in-house device graphs.
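As an illustration only, the sketch below shows the general shape of key-based matching against an in-house device graph: customer records are joined to device IDs through a normalized, hashed email key. The record layout, hash_key helper, and DEVICE_GRAPH mapping are hypothetical assumptions for the example, not Deductive's actual implementation.

    import hashlib

    def hash_key(email: str) -> str:
        """Normalize and hash an email so records can be joined without exposing raw PII."""
        return hashlib.md5(email.strip().lower().encode("utf-8")).hexdigest()

    # Hypothetical in-house device graph: hashed email -> known device IDs.
    DEVICE_GRAPH = {
        hash_key("jane.doe@example.com"): ["idfa-1234", "gaid-5678"],
    }

    def match_records(records: list[dict]) -> list[dict]:
        """Attach device IDs from the device graph to each customer record."""
        matched = []
        for record in records:
            key = hash_key(record["email"])
            matched.append({**record, "devices": DEVICE_GRAPH.get(key, [])})
        return matched

    if __name__ == "__main__":
        crm_rows = [{"email": "Jane.Doe@example.com", "segment": "auto-intenders"}]
        print(match_records(crm_rows))

In practice a match partner such as LiveRamp would perform this join on its own identity spine; the same pattern of normalize, hash, and look up applies when matching directly against a client's device graph.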
- Data Publishing Services
Deductive builds and manages custom data publishing solutions for data providers, and has built real-time data publishing platforms for Fortune 500 companies. Data publishing platform construction typically involves some or all of the following processes:
- Panel Service (NRP)
Some customers have licensed a granular data set and want to use it to measure the behavior of a national audience.
- Attribution Pro
Attribution Pro is a toolset that brands, agencies and media sellers can use to create, track, optimize and analyze cross-platform attribution campaigns, either in-house or as part of a managed service.
- Data Science Consulting
Our data science team has core competencies in data normalization and fusion, identity matching, audience indexing, attribution and prediction. We have developed a set of processes based on our experience handling very large TV, digital and media data sets.
- Data Generation Service
Many of our customers have existing raw datasets that they wish to leverage internally or with third parties. We have developed pipelines to collect, process, and normalize these datasets. This work typically begins with a deep dive into the data actually available, including its reliability and accuracy, followed by the development of a pipeline to process it. We quarantine records that fail validation and put monitoring and alerting in place to ensure data quality at all times.
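As a rough illustration of the collect, validate, quarantine pattern described above, here is a minimal Python sketch. The field names, validation rules, and 5% alerting threshold are hypothetical assumptions for the example, not the production pipeline.

    import logging
    from dataclasses import dataclass, field

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("pipeline")

    # Hypothetical validation rules for a raw viewership record; real checks
    # depend on the customer's data.
    REQUIRED_FIELDS = ("device_id", "timestamp", "channel")

    @dataclass
    class PipelineResult:
        clean: list = field(default_factory=list)
        quarantined: list = field(default_factory=list)

    def validate(record: dict) -> bool:
        """Basic quality checks: required fields present and timestamp is an integer."""
        return all(record.get(f) for f in REQUIRED_FIELDS) and isinstance(record.get("timestamp"), int)

    def normalize(record: dict) -> dict:
        """Normalize field formats, e.g. lower-case channel names."""
        return {**record, "channel": str(record["channel"]).strip().lower()}

    def run_pipeline(raw_records: list) -> PipelineResult:
        """Route records to the clean set or quarantine, and alert when too many fail."""
        result = PipelineResult()
        for record in raw_records:
            if validate(record):
                result.clean.append(normalize(record))
            else:
                result.quarantined.append(record)
        failure_rate = len(result.quarantined) / max(len(raw_records), 1)
        if failure_rate > 0.05:  # hypothetical alerting threshold
            log.warning("Quarantine rate %.1f%% exceeds threshold", failure_rate * 100)
        return result

Quarantined records are kept rather than dropped, so data quality issues can be investigated and reprocessed once the upstream problem is fixed.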