How publishers use AI to balance personalized experiences with monetization strategies

Without a doubt, publishers are well-placed to harness the relationship with their audience – possessing the means to collect and build the strong first-party data sets required to deliver personalized experiences and power various revenue streams.

But, as the industry moves further away from cookie-based targeting – and Google speaks out against alternative ID solutions for cross-site tracking, limiting the ability to scale – publishers must find new ways of boosting and communicating the value of their inventory while also ensuring monetization strategies align with the user experience.

This is where artificial intelligence (AI) comes in, playing a key role in achieving this balance.

Opportunities aren’t disappearing; they’re just different

Today’s advertising landscape is increasingly complicated. Most digital ad spend goes to targeting and retargeting specific individuals, which relies on identity being consistently visible and resolvable across environments. Google’s move has put this approach on the ‘endangered’ list and will likely add to existing fragmentation. Constructing addressable identifiers was already difficult – with users spread across laptops, mobile, CTV, and other smart gadgets – but now, brands will also have to switch between different technologies and systems when using Google or the open web.

On the publisher side, this will affect personalization strategies as a means to deliver value, both from a content and advertising perspective. However, it also offers publishers an opportunity to play a more central role in providing access to addressable audiences for advertisers looking to optimize ad spend through content-rich experiences.

By using AI technology, publishers can streamline the data onboarding process and match brands’ first-party data against their own addressable audiences with a higher accuracy rate than non-AI tools. When applied in conjunction with cleanroom technology, this provides a privacy-safe, publisher-controlled space for data collaboration that matches audiences on a similarity basis, enabling incremental reach in private marketplaces.
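To ground the idea, here is a much-simplified sketch of the deterministic part of such a match: both parties hash their identifiers before anything is compared, so raw emails never leave either side. The column names and example records are invented, and the similarity-based AI layer described above would sit on top of this exact-match step.

```python
# Much-simplified sketch: hash identifiers on both sides before comparing
# them, so raw emails never leave either party. Column names and records
# are invented for illustration.
import hashlib

import pandas as pd

def hash_email(email: str) -> str:
    # Normalise, then hash, so "Ada@Example.com" and "ada@example.com" match.
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

brand = pd.DataFrame({"email": ["Ada@example.com", "bob@example.com"]})
publisher = pd.DataFrame({
    "email": ["ada@example.com", "carol@example.com"],
    "audience_segment": ["finance_readers", "sport_readers"],
})

brand["hashed"] = brand["email"].map(hash_email)
publisher["hashed"] = publisher["email"].map(hash_email)

# Only hashed identifiers are compared inside the shared space.
matched = brand[["hashed"]].merge(
    publisher[["hashed", "audience_segment"]], on="hashed")
print(matched)  # one overlapping user, with the publisher's segment attached
```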

AI offers a route to effective reach enhancement

The two core aces publishers hold are, of course, content and consent. Producing engaging content helps win user engagement and loyalty, while user-centric consent increases the chances of building trust and gaining permission to collect and use much sought-after first-party data. On this basis, publishers are in a good position to build on the foundation of their first-party data strategy to deliver basic reach for known, logged-in users.

The issue, however, lies with the limitations of consented data. Not all users will be willing to share data. In fact, it’s widely considered that just 2-10% of consumers share details such as age and gender.

To sustain optimal reach, publishers will therefore need to explore options beyond the log-in walls. Those keen to keep content as openly available as possible will likely turn to the data processing and enhancement capacity of AI to build on first-party data strategies. High on the list of uses is predictive modeling, powered by machine learning. By taking consented users’ attributes as an analytical base, it allows for the accurate extension of addressable reach – in line with customized and verifiable accuracy rates set by each publisher – even when deterministic data is lacking.
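As a concrete illustration, the following is a minimal sketch, assuming scikit-learn and invented behavioural feature names, of how a publisher might train a model on consented, logged-in users, verify its accuracy on a hold-out set, and only apply inferences that clear a publisher-chosen confidence threshold.

```python
# Sketch: extend addressable reach with a model trained on consented users.
# Feature names, the toy data, and the 0.80 threshold are invented examples.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Consented users: behavioural features plus a declared attribute.
consented = pd.DataFrame({
    "articles_per_week":   [12, 3, 25, 7, 18, 2, 30, 9],
    "avg_session_minutes": [6.5, 2.1, 11.0, 4.2, 8.8, 1.5, 14.2, 5.0],
    "sports_share":        [0.7, 0.1, 0.6, 0.2, 0.7, 0.1, 0.5, 0.3],
    "age_bracket": ["18-34", "35+", "18-34", "35+",
                    "18-34", "35+", "18-34", "35+"],
})
X, y = consented.drop(columns="age_bracket"), consented["age_bracket"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
print("hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Anonymous users: same behavioural features, no declared attribute.
anonymous = pd.DataFrame({
    "articles_per_week":   [20, 4],
    "avg_session_minutes": [9.0, 2.0],
    "sports_share":        [0.6, 0.15],
})
probs = model.predict_proba(anonymous)
confident = probs.max(axis=1) >= 0.80          # publisher-set threshold
anonymous["inferred_age_bracket"] = np.where(
    confident, model.classes_[probs.argmax(axis=1)], "not inferred")
print(anonymous)
```

In practice the feature set, the validation scheme, and the 0.80 threshold would all be tuned to whatever accuracy guarantee a publisher wants to offer.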

For instance, when used in tandem with real-time contextual data, AI can drive impression-level targeting without user-level data. With every use case, the main appeal is that an emphasis on inferred, not declared, characteristics keeps privacy front and center, enabling personalized experiences and targeting without hindering the user experience.

An example of how this could work in the real world is with recruitment data. Publishers with recruitment advertising departments can harness tools to integrate data from job seekers to display highly targeted ads to relevant candidates. AI can then be used to scale reach, expanding the audience based on the initial recruitment data to reach other statistically relevant consumers without impacting the user experience.

What next for the industry?

Gazing into the collective industry crystal ball is never easy, but there are signs of which way the winds are blowing. For instance, the latest proposal to emerge from Google’s Privacy Sandbox initiative, FLoC, suggests the use of machine learning analysis to create a cohort-based approach to targeting.

For publishers previously wary of AI-assisted audience syndication, this could be good news: allowing them to build stronger ties with advertisers and pave the way to scale audiences. Setting aside the debate around whether or not FLoC will be anti-competitive, there is no denying that it will likely drive further development of machine-learned segmentation and personalization, which is a good move for the industry.

In a continuously changing industry, AI ultimately provides an opportunity for publishers to be optimistic about their ability to balance personalized experiences with privacy-first monetization strategies. The advanced solutions offered by AI empower publishers to forge their own path and equip them with the tools required to show that they are not merely providers of first-party data but linchpins of scalable, privacy-safe solutions.

The Top 5 Industries That Can Benefit From Data Monetization

Data monetization can be an effective tool for helping companies and sectors boost profits and keep consumers happy. Here are five of the industries that data monetization strategies could benefit the most.

Music

The music industry experienced a prolonged period of upheaval due in large part to streaming services’ popularity. The shift to streaming and away from physical music initially caught many industry executives unprepared, making them scramble to deal with the change.

They’re more accustomed to it now, especially since the COVID-19 pandemic helped more people get acquainted with streaming concerts. Grammy-winning artist Brandi Carlile held several ticketed live streamed concerts, with proceeds going to her crew members who were out of work due to the pandemic. Carlile also chose several charities to support with the generated income.

An artist might track the number of viewers for a live stream, or the percentage of people who buy tickets several days or weeks before an event. With that data, it’s easier to determine whether internet-based concerts could prove profitable.

Also, streaming service Spotify offers a significant amount of data to artists who use the platform. For example, musicians can see stream count updates of new releases for the first week of their availability. The number updates every two seconds to give artists accurate perspectives.

Spotify also shows how listeners come across tracks, whether by discovering them through playlist mixes or other means.

Automotive

Today’s automobiles are getting progressively more advanced, and that typically means they collect more data that companies can monetize.

Statistics also indicate that many consumers don’t mind if car manufacturers gather data from them. A 2020 McKinsey & Company study revealed that 37% of consumers would switch to car brands that offered enhanced connectivity.

One data monetization possibility is to track trends related to certain models, color choices, or other features in particular markets. Then, manufacturers could ensure dealerships have the cars that are most likely to sell.

A General Motors representative confirmed that the data it collects generally relates to a car’s location, driver behavior, and vehicle performance. However, they said that the company couldn’t link much of the data to particular individuals.

Brands aiming to roll out successful data monetization strategies should safeguard against privacy violations. If consumers feel companies know too much, they could grow increasingly unwilling to use data-sharing features.

Retail

There’s growing interest in data monetization across industries. One study found that more than 91% of executives polled noticed increases in related investments. For example, if a company representative purchases a data analytics platform subscription, they could see insights that might otherwise get overlooked.

The retail sector is an industry with tremendous potential to benefit from data monetization. For example, a brand could track how many e-commerce shoppers redeem a discount code associated with a particular social media campaign. Such statistics help determine whether the effort got the desired results.

Alternatively, physical store data monetization could involve tracking the busiest shopping hours. Perhaps a manager realizes many people leave without buying after seeing crowded store areas or long lines. If so, the solution could be to staff more employees to cope with increased demand.
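As a simple illustration of the in-store case, the sketch below counts transactions by hour with pandas and flags the hours that run well above the median load as candidates for extra staffing. The timestamps and the 1.5x-median threshold are invented examples.

```python
# Sketch: find the busiest in-store hours from point-of-sale timestamps.
# The timestamps and the 1.5x-median threshold are invented examples.
import pandas as pd

transactions = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2021-04-03 11:05", "2021-04-03 11:40",
        "2021-04-03 12:10", "2021-04-03 12:15", "2021-04-03 12:50",
        "2021-04-03 17:30",
        "2021-04-03 18:05", "2021-04-03 18:10", "2021-04-03 18:20",
        "2021-04-03 18:40",
    ])
})

# Count transactions per hour of day.
hourly = (
    transactions
    .assign(hour=transactions["timestamp"].dt.hour)
    .groupby("hour")
    .size()
    .rename("transactions")
)

# Hours running well above the median load are candidates for extra staffing.
threshold = 1.5 * hourly.median()
print(hourly[hourly >= threshold].sort_values(ascending=False))
```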

A typical data monetization challenge happens when brands collect too much information, and there is not enough time to analyze it thoroughly. Thus, retailers seeking to maximize their benefits should choose a few desired goals and determine what kind of data is most helpful in achieving them.

Healthcare

People in the healthcare industry are well-accustomed to using available data to make the most appropriate care decisions. A patient’s lab results or vital signs often dictate which treatments to provide and when. However, organizations can also use data to support profitability.

One example is to explore the issues behind missed appointments. When people don’t show up, the facility can’t offer that slot to someone ready and willing to take it. A closer look at the data might indicate that most no-show patients say they did not know they had appointments scheduled.

A text message that automatically adds a person’s appointment to their digital calendar would reduce the issue. Additionally, a data monetization strategy may indicate that many patients could get the necessary care outside of real-time visits.

New Mexico’s Presbyterian Healthcare Services began using an asynchronous communication system several years ago. In 2020, staff members fielded 50,000 low-acuity care queries, each taking an average of two minutes to complete. Patients usually got responses to their text-based queries within 15 minutes.

This approach highlights some possible metrics to track during a data monetization effort. For example, how long do patients wait for answers? What percentages of cases can providers tackle without in-person or video-based visits?
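Those two questions translate directly into simple aggregations. Here is a minimal sketch using pandas; the message log, column names, and values are invented for illustration rather than drawn from Presbyterian’s actual system.

```python
# Sketch: two metrics for an asynchronous care channel, computed with pandas.
# The message log, column names, and values are invented for illustration.
import pandas as pd

queries = pd.DataFrame({
    "sent": pd.to_datetime([
        "2021-02-01 09:00", "2021-02-01 09:30",
        "2021-02-01 10:05", "2021-02-01 11:45",
    ]),
    "first_response": pd.to_datetime([
        "2021-02-01 09:08", "2021-02-01 09:41",
        "2021-02-01 10:31", "2021-02-01 11:58",
    ]),
    "resolved_without_visit": [True, True, False, True],
})

# How long do patients wait for answers?
wait_minutes = (queries["first_response"] - queries["sent"]).dt.total_seconds() / 60
print("median wait (minutes):", wait_minutes.median())

# What share of cases is handled without an in-person or video visit?
print(f"handled asynchronously: {queries['resolved_without_visit'].mean():.0%}")
```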

Marketing

Data monetization is already a common practice in the marketing sector. However, research indicates the trend will continue.

A January 2021 study indicated that 88% of marketers intend to prioritize gathering and storing first-party data; 58% of respondents considered it a high priority, and 30% said it was their utmost concern over the next 6-12 months.

However, the company that conducted the study indicated the growing importance of zero-party data. First-party data comes from customers’ interactions but often gets collected in the background. Zero-party data is information that those people intentionally give to businesses.

Monetizing data can improve marketing outcomes in numerous ways. Many companies look at data while planning campaigns or choosing which advertising channels to use for particular audiences.

Marketing professionals can also apply data analytics to determine which outreach methods will likely resonate most with specific audiences. While working out a strategy, company representatives should assess known challenges and how increased information could overcome them.

Data Monetization Makes Sense

These are some of the sectors most likely to profit from data monetization initiatives. However, other industries could see similarly positive outcomes, especially if representatives take care to ensure the data’s reliability.

AI, blockchain, and new ways for everyone to monetize their data

Breakthroughs in AI and innovations in applying blockchain for personal data control and monetization enable new ways to make money off of personal information that most people currently give away for free.

Here we highlight three data science and business model innovations, starting with breakthrough ML technology that learns on the fly.

This AI Bloodhound can hunt

There’s an emergent machine learning technology out there that offers a clever new way of finding and classifying unstructured content.

In geek-speak, the technology is a vertical, personalized search engine that doesn’t require expensive knowledge graphs.

In human speak, it’s a context-sensitive, human-in-the-loop search engine that uses search criteria and implicit user feedback to recommend high-quality results.

And in “talk to me as you might a young child, or a golden retriever” speak, it’s like an AI Bloodhound in that it is instantaneously trained by the “scent” of content to go find, score, and bring back content that has an implicitly similar “scent.”

In user experience speak, it’s like this:

  • Start a search session by using examples of what you’re looking for
  • Choose results you like to tell your personalized AI to “go find more results like this”
  • And alert me when you find me something good (if you’re out and about)

Unlike other ML applications, which need to be trained on high-quality content over weeks or months, this AI learns on the fly. Also, unlike other ML models, which are monolithic and have a reputation for being expensive and for baking in problems like “AI bias,” with this AI, pristine engines are instantiated at each user’s search-session level. There isn’t one ML engine – there are as many as you need. And because the ML engines are pristine when instantiated, there is no AI bias. Instead, it’s the human-in-the-loop who introduces the “bias” by giving the personalized ML engine examples of the types of results it should be finding and bringing back.
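A minimal sketch of this idea is below, implemented as classic relevance feedback over TF-IDF vectors with scikit-learn: a fresh engine is created per session, seeded with example documents, and nudged toward whatever results the user marks as good. The corpus, the class name, and the 0.5 feedback weight are invented for illustration; the production system presumably works differently under the hood.

```python
# Sketch: a per-session, human-in-the-loop retrieval engine, implemented here
# as Rocchio-style relevance feedback over TF-IDF vectors.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "senior python engineer with machine learning experience",
    "marketing manager for consumer electronics brand",
    "data scientist focused on recommender systems and nlp",
    "warehouse operations supervisor, forklift certified",
    "backend developer, python, aws, distributed systems",
]

class SessionEngine:
    """A fresh ("pristine") engine is instantiated for each search session."""

    def __init__(self, docs):
        self.vectorizer = TfidfVectorizer().fit(docs)
        self.doc_vecs = self.vectorizer.transform(docs).toarray()
        self.profile = None  # the session's learned "scent"

    def seed(self, examples):
        # Initial training: the centroid of the user's example documents.
        self.profile = self.vectorizer.transform(examples).toarray().mean(
            axis=0, keepdims=True)

    def feedback(self, liked_indices, weight=0.5):
        # On-the-fly retraining: pull the profile toward results the user liked.
        liked = self.doc_vecs[liked_indices].mean(axis=0, keepdims=True)
        self.profile = (1 - weight) * self.profile + weight * liked

    def search(self, top_k=3):
        # Score every document against the current profile and rank.
        scores = cosine_similarity(self.profile, self.doc_vecs).ravel()
        return sorted(enumerate(scores), key=lambda pair: -pair[1])[:top_k]

engine = SessionEngine(corpus)
engine.seed(["python developer, machine learning, nlp"])
print(engine.search())   # initial ranking from the seed examples alone
engine.feedback([2])     # "go find more results like this one"
print(engine.search())   # re-ranked after the implicit feedback
```

Because each SessionEngine starts empty, any “bias” it picks up comes solely from the examples and feedback that one user supplies during that session, mirroring the behavior described above.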

One of the areas where this AI is being used is hiring and recruiting, within the context of a unique personal data monetization community.

Hey fam, the FAAAM oligarchs won’t be too happy about this

There’s an emergent community platform model in which members are given the ability to provide their personal information, configure its availability to others in the community, and then choose how to monetize it.

Members own all their data and determine which community actors benefit from it (unlike the tech giants who take your data, do who-knows-what with it while making money hand over fist, and then pay you little to nothing for it). 

Everything is stored securely thanks to blockchain technology, and only users (not the platform administrators) have the key to access it.

Opportunities are created for users to share their information with companies and other community members interested in purchasing it while also keeping that information safe and secure. AI expedites monetization by optimizing the matching of sellers and buyers.

An AI-powered staffing service that pays candidates to interview

One category of personal information that community members can choose to monetize is their resume. This is done via a Staffing Service “module” within the community and is powered by that AI Bloodhound technology.  Both sides of the supply-demand equation experience value:

The user experience (UX) for Staffing Company users or enterprise HR users:

  1. Create a new job or requisition and configure the usual fields like role name, description, key requirements, and so on.
  2. Drag and drop examples of resumes that the client or staffing company believes would be great for the role. Think of it as giving the AI resumes the likes of which the recruiter would hope to see from candidates.
  3. Click the Save button.

That req is now its own AI-powered search session. When it runs a search, its results will include candidates whose resumes are implicitly similar to those used in step #2 above; the AI Bloodhound technology finds resumes with the same “scent” as those it was trained on.

As recruiters review the search results, they select the resumes that they feel the AI got right: “These are the types of candidates I was hoping to find!” With those results selected, they click the Search button again. The AI engine powering that specific requisition is instantly re-trained using the resumes the recruiter just selected (positive model reinforcement). After only a couple of these cycles, the AI search results will reliably show candidates whose resumes were given a high AI score because they were implicitly and contextually similar to what the recruiter explicitly told the engine they were interested in.

It’s important to note that high-quality search results that the recruiters see don’t include candidates’ names or contact information at first. You’ll see why in a moment.

The UX for community members looking for jobs:

  1. When candidates configure their resume settings, they set the dollar amount they want to be paid to reveal their contact information and take an interview with a recruiter.
  2. After a candidate’s resume is uploaded or set as Active, it is analyzed and scored by all of the AI engines powering all of the jobs/requisitions created by the Staffing Companies.
  3. A candidate’s resume will be given high AI scores by requisitions whose AI engines were trained on example resumes implicitly similar to the candidate’s own.
  4. A candidate is notified when there’s a position for which they’re a good fit, and the recruiter has agreed to pay the fee configured in #1.
  5. The candidate then coordinates with the recruiter to discuss the opportunity.

Winner winner, chicken dinner

The hiring manager wins because the AI found the type of candidate they were looking for. They are happy to pay the candidate’s fee to avoid the time and frustrations involved with a keyword search.

The candidate wins twice: 1) They found a job opportunity that seems to be an excellent fit for them, and 2) They got paid while doing so.

And of course, everybody wins as more people are employed in jobs that are a good fit for them. Better quality of work. Less turnover. This can only help put the economy on a path back to pre-Covid levels.

Peer-to-peer digital marketplace for personal information: a sharing economy to dwarf all others?

Companies like Uber and Airbnb showed the world the power of “Sharing Economies” and the value that can be unleashed when you take an otherwise idle asset and make it available for utilization.

This data control and monetization community is like a sharing economy, except instead of putting physical assets into play, it puts members’ personal data and intellectual capital into play.

There are many more people on the planet than there are cars, homes and apartments, private rooms, and other properties. Every person has personal information. Therefore everyone stands to benefit.

Except, of course, companies with business models that depend on people giving them their personal information for free.

Co-author:


A serial entrepreneur, Erik has started a number of companies as the main inventor and technologist. A veteran of the tech industry, he has a passion for innovation and a belief in the power of creativity that have led to the success of several leading startups and high-growth technology companies. As Founder and CEO of ImagineBC, his current mission is to unlock data for more equitable outcomes for users of data. He has kickstarted a new data economy and, in doing so, worked to align data practices with growing consumer expectations.

Prior to launching his latest venture, Erik was at the forefront of the intersection of human capital management and technology with his company Benepay Technology. Benepay Technology, the parent company of ImagineBC, remains operational and well-positioned. Erik also founded, and later sold, PowerPay software for $22 million. He is also the president of the Data Union, a global movement aimed at creating a more equitable and ethical data economy.

Erik’s industry accolades include his appointment to the Forbes Tech Council, Rolling Stone’s Culture Council, and the Radical Exchange.

Why predictive marketing intelligence is the next frontier of data monetization

Clarification that Apple’s privacy changes will be stricter than expected is just the latest blow for digital publishers reliant on building strong audience relationships to drive their revenues.

The media owner’s ability to keep monitoring audience activity and deliver personalized digital experiences has faced multiple challenges this year — including Google’s deadline on third-party cookie support and the Federal Court of Justice ruling that legitimate interest isn’t a valid purpose for ad tracking. 

The fact that consent will be needed to trace users across iOS apps and browsers means cookie-generated data will fade at an even faster rate, along with the traditional mechanisms of profile building and message tailoring. Little wonder some industry pundits are arguing the wider technology ecosystem should start building arks to weather the ongoing deluge. 

Yet, the problem with looking to the arks developed by tech giants is that it only increases publisher dependence on them and their data. As the industry seeks alternatives to cookies, there’s an opportunity for the open web to leverage its own assets. 

Enter predictive marketing intelligence — the next frontier of efficient data monetisation.

Taking analytics to the next level 

There is a reason why Google has recently dialed up its predictive analytics capacity. In the post-cookie world where identifiers have less scope, marketers and publishers alike will need intelligent tools that enable them to make the most of accessible data. And as shown by the new updates to Google Analytics, the tech titan is keen to meet this need by offering AI-based solutions for first-party data processing and trend prediction – albeit while retaining its power as the gatekeeper of data access and analytics.

To give businesses a true competitive advantage, data tech innovation must aim at breaking down the barriers to data access and understanding, while fostering data independence.  It’s about empowering businesses to expedite processes, make better decisions and seize opportunities by unlocking the value embedded in the organization. As the industry strives to build a more transparent and fairer data supply chain, these transformative initiatives will be crucial to allow publishers to better understand audiences, but also use first-party information to efficiently build audience segments, in a privacy-compliant way. 

The next generation of marketing intelligence platforms  

Publishers are, of course, familiar with the idea of tapping first-party data to tackle targeting issues but doing so isn’t necessarily easy. 

While they recognise the huge potential of their rich audience information, it’s frequently too disparate and disorganised to put into action. The causes of this chaos are diverse. Because users interact with content in infinitely varied ways, data about their activities flows in an unstructured tide. The mixed audiences that sites attract – usually a blend of logged-in and anonymous users – also mean data is often incomplete, covering known attributes about some opted-in individuals and fragments of information about others.

The upshot, however, is that most publishers are left grappling with fragmented data that makes it difficult to produce the holistic individual-level view needed for accurate profiling; particularly if internal data capabilities are minimal. Fortunately, the new breed of marketing intelligence platforms that leverage predictive analytics presents an answer to many of these problems.

In addition to coordinating disjointed data, predictive analytics tools can help publishers turn it into a viable basis for accurate, impactful, and privacy-led marketing.

Super-charging segmentation with data quality 

By deploying smart algorithms to analyse on-site activity, advanced analytics tools produce instant insights into user habits, needs and interests – allowing publishers to augment individual profiles and create granular segments in real time. As a result, they will be able to offer refined yet cookie-free audience segments fuelled by reliable, quality data; a competitive advantage that’s especially important when 35% of marketers are losing vital budget to inaccurate targeting.
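As a rough sketch of what that segmentation step can look like, the snippet below clusters users by simple engagement features with scikit-learn’s MiniBatchKMeans, which can be updated incrementally as new activity arrives. The feature set, the two-segment choice, and the figures themselves are invented for illustration.

```python
# Sketch: behavioural audience segments from on-site activity, built with
# MiniBatchKMeans so new batches of events can update segments incrementally.
import numpy as np
from sklearn.cluster import MiniBatchKMeans
from sklearn.preprocessing import StandardScaler

# One row per user: pages/visit, avg dwell (s), finance share, sport share.
activity = np.array([
    [2.0,  40, 0.1, 0.7],
    [8.0, 180, 0.8, 0.0],
    [3.0,  60, 0.2, 0.6],
    [9.0, 200, 0.7, 0.1],
    [1.0,  20, 0.0, 0.9],
    [7.5, 150, 0.9, 0.0],
])

scaler = StandardScaler().fit(activity)
segments = MiniBatchKMeans(n_clusters=2, n_init=3, random_state=0)
segments.fit(scaler.transform(activity))
print("segment per user:", segments.labels_)

# As fresh activity streams in, update the segments without a full re-fit.
new_batch = np.array([[6.0, 120, 0.6, 0.2]])
segments.partial_fit(scaler.transform(new_batch))
print("new user assigned to segment:",
      segments.predict(scaler.transform(new_batch)))
```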

Moreover, persistent analytics that draw on incoming and historic data can also anticipate what users are likely to be interested in next – in terms of content and brand messaging – giving publishers the means to optimise the value of their assets on all sides.

Amplifying audience insight 

Publishers can even combine known user characteristics with smart modeling techniques to learn more about their overall audience, anonymous users included. Where previous lookalike models have typically required cumbersome data processes, often taking weeks to run – not to mention relying on tech heavyweights’ ‘black boxes’ – emerging platforms are capable of real-time, streamlined, and transparent automation.
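One transparent way to build a lookalike audience is to average a seed set of known users into a centroid and rank everyone else by similarity to it. The sketch below shows that approach with scikit-learn’s cosine similarity; the feature vectors and the 0.95 cut-off are invented, and production systems typically use richer models, but the logic stays inspectable end to end.

```python
# Sketch: transparent lookalike expansion. Average a seed set of known users
# into a centroid, then rank anonymous users by cosine similarity to it.
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

seed_users = np.array([         # e.g. a marketer's matched customers
    [0.90, 0.10, 0.70, 0.30],
    [0.80, 0.20, 0.60, 0.40],
    [0.85, 0.15, 0.75, 0.20],
])
anonymous_users = np.array([
    [0.88, 0.12, 0.70, 0.30],   # behaves much like the seed set
    [0.10, 0.90, 0.20, 0.80],   # behaves very differently
    [0.70, 0.30, 0.50, 0.50],
])

centroid = seed_users.mean(axis=0, keepdims=True)
scores = cosine_similarity(anonymous_users, centroid).ravel()

# Anonymous users above the similarity cut-off form the lookalike segment.
lookalikes = np.flatnonzero(scores >= 0.95)
print("similarity scores:", np.round(scores, 3))
print("lookalike audience (row indices):", lookalikes)
```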

Going a step further, equipped with powerful insight gleaned from their first-party data, publishers can extend audience knowledge by exploring data enrichment and exchange solutions in partnership with marketers. By combining a variety of data points – first-party, behavioural, CRM, and connected-device data, to name a few – publishers can better understand their audiences and more accurately identify potential new segments. With a clearer picture of complete audience trends and preferences, publishers will then have the means to create even more tailored content and bring 100% of their users within targeting range.

Publishers already have the key ingredient needed to keep the digital marketing system operating and maintain crucial revenue; they just need to apply it more effectively. With the progression of predictive marketing analytics has come a wider world of first-party data possibilities, including the potential to address long-standing issues with unruly audience information and unlock the rich insights it contains. For smart publishers, the next best move will be to embrace the new frontier of data monetization with advanced marketing intelligence and stronger data independence.

Data is the New Dollar: Turning Data Into Business Profit

Every business knows that their data, from customer demographics and buying behavior to production insights, holds tremendous value. But, historically, most have considered only its internal worth—the insight it provides to improve operational efficiency, deliver a better customer experience, enhance products and services or save money.

But now, a new perspective on data is emerging. Not only is the data valuable for internal purposes, but it can even be sold by companies who are establishing a data-as-a-service model, and the examples are quite surprising. Businesses of all types are exploring the market potential of their data, considering ways to productize, package and profit from it—how they can literally turn their data into dollars – within the security, privacy and regulatory guidelines.

What can your data do for you?

The ability to collect and analyze rich, dynamic data—made possible through cloud computing, massive data warehousing systems and business intelligence solutions—could turn companies many would never consider as being in “the data business” into potentially lucrative purveyors of information.

A toothbrush manufacturer, for example, knows a tremendous amount about toothbrush consumption, including how often toothbrushes are replaced and which SKUs sell best in specific geographic regions or to certain demographics. This unlikely “data business” could package and sell that insight to retailers, who would pay for it to help them optimize SKUs, determine which products will most likely succeed in one location versus another, or turn over inventory faster.

As another example, medical laboratories conduct millions of screenings daily, providing accurate, timely results back to health care providers and consumers. These labs could anonymously analyze that data to uncover patterns and insights that would be extremely valuable to both health care organizations seeking ways to reduce and prevent disease and to pharmaceutical companies developing and marketing new medications. The data might also reveal that specific patient groups obtain screening exams at certain times during the year, data that physicians and hospitals could use to market their services to the right patients at the right time.

A “perfect storm” for data-driven business opportunities

We’re seeing the trend toward uncovering the profit potential within data stores across many business sectors. Companies like Uber, Airbnb and FitBit all provide an interesting product or service, but their true value lies in the data they collect and leverage: insight about our commute schedules, travel habits, physical activity levels and overall health. And the potential value of this data is being amplified by a perfect storm of factors:

  • Mobile devices—We can now collect, track and analyze data wherever and whenever consumers go.
  • Internet of Things—both wearables and machine sensors have become ubiquitous, low-cost data-capturing juggernauts.
  • Global perspective—Our communications—and the data collected over the platforms we use (Facebook, WhatsApp, Twitter, etc.)—are truly global in nature.
  • Cloud/SaaS platforms—No longer are our data efforts confined to on-premises, behind-the-firewall systems. Instead, our capacity to collect, analyze, share and visualize data from many diverse sources seems infinite.

Dealing with the data deluge

Not only do we have access to more data than ever before, but it also has the potential to be exponentially richer in context and meaning. It is attributable directly back to its source, where we can compare and contrast it with other attributes and characteristics. But, in order to uncover the richness of data and derive its monetary value, companies must do the following (a brief sketch of the first three steps appears after the list):

  1. Gain access to complete data, integrated from a wide variety of sources, formats, languages and protocols.
  2. Effectively manage the data to ensure its quality. This includes verifying, cleansing, harmonizing, and storing it properly for analysis.
  3. Analyze the data, for which there is no shortage of algorithms and options, almost all of which require in-depth analytical skills and talent.
  4. Use business intelligence tools to visualize and gain actionable insight. This relies on a significant assumption that the data is complete, managed properly and analyzed accurately. Otherwise, it’s garbage in, garbage out.
  5. Assure data security and regulatory compliance across all of these steps.
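To make the first three steps tangible, here is a minimal pandas sketch that integrates two differently formatted sources, cleanses and harmonises them, and runs a simple analysis ready for a BI layer. The sources, column names, and figures are invented for illustration.

```python
# Sketch of steps 1-3: integrate two differently formatted sources, cleanse
# and harmonise them, then run a simple analysis ready for a BI layer.
import pandas as pd

# Step 1: access data from two sources with different conventions.
crm = pd.DataFrame({
    "customer_id": [1, 2, 2, 3],
    "region": ["north", "North ", "north", "SOUTH"],
    "spend_usd": [120.0, 80.0, 80.0, 200.0],
})
web = pd.DataFrame({"cust": [1, 2, 3], "visits": [14, 3, 22]})

# Step 2: manage quality -- harmonise keys and categories, drop duplicates.
crm = (
    crm.assign(region=crm["region"].str.strip().str.lower())
       .drop_duplicates()
)
web = web.rename(columns={"cust": "customer_id"})
combined = crm.merge(web, on="customer_id", how="inner")

# Step 3: analyse -- spend per visit by region, ready to visualize.
summary = combined.groupby("region").agg(
    total_spend=("spend_usd", "sum"),
    total_visits=("visits", "sum"),
)
summary["spend_per_visit"] = summary["total_spend"] / summary["total_visits"]
print(summary)
```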

There are a multitude of technologies and approaches available to help enterprises achieve these objectives. But even with these tools, companies still struggle to derive the full value from their data for their own operational insight, much less for taking advantage of new revenue-generating opportunities. In fact, most companies are analyzing only 12 percent of their data. Even more disheartening, the ROI on Big Data projects is abysmal, with companies expecting an average of $3.50 return per $1 invested, but realizing just $0.55. Not only is the ROI lackluster, one-third of business leaders say they’re so uncertain about the accuracy of their analysis, they don’t trust it to make decisions.

The failure lies in the overwhelming complexity of dealing with multiple tools and processes. Most traditional approaches segregate data integration and data management as two separate endeavors, despite their interdependency. And, most integration efforts focus on applications—making disparate software work together—leaving data integration as somewhat of an afterthought, often resulting in even more siloed, dirty or duplicated data. The situation is compounded by the fact that most solutions are self-service. This leaves companies to do the highly technical and complex integrations themselves—while under immense pressure to move with speed and agility to stay ahead of the competition—and to stay within a cost-constrained budget.

Given these obstacles, it’s no surprise that it’s nearly impossible for companies to gain the visibility and deep insights they need to serve their internal needs, let alone uncover opportunities to monetize their data.

dPaaS is the answer

As companies demand more cohesive, integrated solutions to generate data-driven revenue, a new approach to data management is bringing all of these pieces together in a much simpler, tightly integrated way. dPaaS, or Data Platform-as-a-Service, puts data at the center of the process, resolving the inherent problems of data quality, harmonization and integration in a fully managed environment. The dPaaS difference lies in its unique approach that combines:

  1. A multi-tenant cloud platform, which provides greater agility, resources and scalability to handle the growing complexity and quantity of data.
  2. A single, unified platform that eliminates the piecemeal approach and simplifies integration and data management.
  3. Full visibility into the data—with complete tracking to see exactly where it’s come from, how it was modified, what was analyzed, how it looks today and where it’s going—instead of a black-box solution that leaves analysts in the dark.
  4. Simple API management that allows businesses to do whatever they want with their data—construct packages, apply algorithms and insights, visualize, etc.—with complete flexibility.

In short, with dPaaS, companies can get down to the business of extracting value from their data, rather than spending so much time wrestling with it. With faster, more efficient time to insights, companies that never considered themselves data purveyors can open an entirely new revenue stream to maximize the value of—and monetize—their data.

