The Facebook Exodus: Why I’m Leaving and Why Expert Verification Matters More Than Ever
Thu, 09 Jan 2025

Mark Zuckerberg just dropped a bombshell. Meta, the parent company of Facebook and Instagram, is abandoning its professional fact-checking program. Instead, they’re moving to a “community-driven” system, putting the onus on users to determine what’s true and what’s not.

Zuckerberg says it’s about fostering “free speech,” but it feels a lot like abdicating responsibility, saving money, bowing down to political pressure, and more.

Frankly, it’s the last straw. I’m done with Facebook.

I’ve been wrestling with this for a while now. The endless scroll, the monetization of my life, the performative outrage, the nagging feeling that I’m being manipulated by algorithms, the blatant and widely covered manipulation… it’s exhausting. But this latest move? It’s a dealbreaker.

Look, I get the appeal of crowdsourcing. The wisdom of the crowd, right? But when it comes to complex issues, “common sense” isn’t always enough. We need experts. We need evidence. We need nuanced analysis, not just knee-jerk reactions and confirmation bias.

Zuckerberg, in his infinite wisdom (read: with a healthy dose of self-preservation), has decided to throw his fact-checking partners under the bus. Possibly, those annoying truth-tellers were just too good at their jobs, exposing uncomfortable truths and generally making life difficult for the Facebook overlords.

According to Zuck, these fact-checkers were “too politically biased” and, get this, “destroyed more trust than they created.” It’s a classic case of blaming the messenger, wouldn’t you say?

Of course, the fact-checking organizations themselves aren’t taking this lying down. They’ve fired back, pointing out the obvious: they simply flagged potentially false content. What Facebook chose to do with that information was entirely up to them.

It’s a bit like a chef blaming the health inspector for a dirty kitchen. “Oh, those inspectors are just too picky! They’re ruining my reputation!” Never mind the fact that the kitchen’s a mess and the menu is probably giving people food poisoning.

Take climate change, for example. The science is clear, yet misinformation runs rampant on social media. Do we really want the veracity of climate data determined by a popularity contest? Or how about public health? Anti-vaccine sentiment is already a serious problem, fueled by conspiracy theories and misleading claims. Letting those narratives go unchecked – or voted “true” by coordinated groups of community members with an agenda – could have devastating consequences.

This isn’t about censorship. It’s about accountability. Social media platforms have a responsibility to ensure the information they disseminate is accurate and trustworthy. They’ve become our primary source of news and information, and with that power comes a responsibility to combat the spread of harmful falsehoods.

So where do we go from here? I, for one, am turning to platforms and tools that prioritize expert verification and rigorous fact-checking. Solutions like Factiverse, for example, which leverages a network of over 350k human-performed fact-checks from more than 100 trusted outlets globally to analyze information and provide context.
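A service like this can be pictured as looking a statement up against a corpus of published fact-checks and reporting who backs it up and who contests it. A minimal Python sketch of that idea – the names, data and keyword-overlap matching are invented for illustration, not Factiverse’s actual API or method:

```python
from dataclasses import dataclass

@dataclass
class FactCheck:
    outlet: str   # name of the fact-checking organisation
    claim: str    # the claim the outlet examined
    verdict: str  # "supports" or "contests"

def check_statement(statement, corpus):
    """Return which outlets back up or contest a statement.

    Matching here is naive keyword overlap; a real system would use
    semantic similarity across hundreds of thousands of fact-checks.
    """
    words = set(statement.lower().split())
    result = {"supports": [], "contests": []}
    for fc in corpus:
        overlap = words & set(fc.claim.lower().split())
        if len(overlap) >= 3:  # crude relevance threshold
            result[fc.verdict].append(fc.outlet)
    return result

corpus = [
    FactCheck("Outlet A", "global average temperatures have risen since 1900", "supports"),
    FactCheck("Outlet B", "vaccines cause more harm than the diseases they prevent", "contests"),
]
result = check_statement("average global temperatures have risen sharply since 1900", corpus)
# result lists Outlet A as supporting the statement; no outlet contests it
```

The point of the design is the output shape: rather than a binary true/false label, the reader sees which sources support and which contest a claim, and can weigh them.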

Factiverse’s approach gives me hope and gives me the tools to see which sources back up and contest a statement, so I can be informed and balanced. It’s a reminder that truth still matters, and that there are people out there dedicated to upholding it. In a world where facts are increasingly contested, we need reliable sources of information more than ever.

Maybe Zuckerberg’s gamble will pay off. Maybe the “wisdom of the crowd” will prevail. But I’m not sticking around to find out. I’m logging off Facebook and investing my time in platforms that value truth and accuracy. Because in the end, facts matter. And we all deserve better than to be drowning in a sea of misinformation.

This article was originally published on Hackernoon and is republished with permission.

First Speakers Announced for Data Natives 2018, The Tech Conference of the Future
Thu, 09 Aug 2018

Preparations are well underway for the 2018 edition of Data Natives – the data-driven conference of the future, hosted in Dataconomy’s hometown of Berlin. Data Natives brings together a global community of data-driven pioneers to explore the technologies that are shaping our world – from big data to blockchain, from AI to the Internet of Things.

On the 22nd & 23rd November, 160 speakers and 1,600 attendees will come together to explore the tech of tomorrow. As well as two days of inspiring talks, Data Natives will also bring informative workshops, satellite events, art installations and food to our data-driven community, promising an immersive experience in the tech of tomorrow.

The theme of this year’s conference is Quickening. Technology is developing at an unprecedented rate, and the pace of life is advancing with it – we all know the famous statistic that more data was produced in 2017 than in the previous 5,000 years of human existence combined. But what does this accelerated pace mean for businesses, for scientists, and for each and every human? How can enterprises catch up to a tech landscape that’s moving at such speed? How can we best use the influx of data to fuel the breakthroughs that will shape our scientific landscape? And how can we adapt to life at an accelerated pace – is there any way to switch off and slow down?

Today, we’re pleased to announce the first seven speakers who will join us in Berlin to explore these questions, and provide insights that will allow us to catch up to an accelerated world.


BART DE WITTE – Director of Digital Health DACH, IBM & Chair, Faculty of Digital Health, futur/io Institut

Bart De Witte has over 20 years’ experience at the intersection of health, technology and business, working for companies including SAP, and is currently overseeing IBM’s Digital Health Unit. As IBM’s Director of Digital Health DACH, Bart is working on transforming health solutions in the DACH region, using IBM’s Watson Health to aid in this mission. When he’s not working with one of the world’s most sophisticated data platforms, he chairs the Faculty of Digital Health for futur/io, a new education and research institute focused on developing health systems which are inclusive and accessible to all.

ANNINA NEUMANN – VP Data Technology, ProSiebenSat.1 Media SE

ProSiebenSat.1 is the leading German entertainment company, reaching 45 million households via web, mobile, smart TV, apps and social media. Unsurprisingly, such a vast company produces a vast and challenging array of data – and Annina Neumann is the person responsible for using this data to provide insights and build data-driven software components. For Data Natives 2018, Annina will be sharing her unique insights into AI in business applications.

NOA TAMIR – Data Science Team Lead, Babbel

A master of all trades, Noa Tamir has a diverse background in physics, economics, design and management. She has applied her unique, multi-disciplinary approach to data science at King.com, and is currently the Data Science Team Lead at Babbel. When she’s not building Babbel’s Decision Making Data Science team, she also organises the R-Ladies Berlin Meetup.

DANIEL MOLNAR – Data Engineer, Shopify

Daniel Molnar has 19 years’ experience in startups, 9 years’ experience in data, and has worked for leading companies including Microsoft and Zalando. This year, Daniel joined Shopify, the e-commerce giant that empowers virtually anyone to open their own online store. With clients including Red Bull, Nestlé and Kylie Cosmetics, Shopify has proven itself a titan of e-commerce – with the hordes of data to prove it. We look forward to Daniel’s insights into wrangling such massive data sets.


STEWART ROGERS – Director of Marketing Technology, VentureBeat

VentureBeat is a publication known by almost everyone in the tech community – and their Director of Marketing Technology, Stewart Rogers, is similarly well-known in tech publishing and beyond. A self-described “journalist, analyst, founder, speaker, and digital nomad”, Stewart is currently the Director of Marketing Technology for VentureBeat and VB Insight, VentureBeat’s analysis and reports resource. With over 25 years’ experience in the IT and software industries, Stewart has used his unique insights into the developments in software and IT – past, present and future – to contribute to numerous publications and events.

JOHANNES STARLINGER – Health Data Researcher, Charité – Universitätsmedizin Berlin

Johannes Starlinger is a computer science postdoc and MD doing research in biomedical informatics at Charité and the research group for Knowledge Management in Bioinformatics at the Department of Computer Science at Humboldt-Universität zu Berlin. Combining his previous research and studies in Medicine and Computer Science, Johannes is currently focusing on service-oriented architectures in the healthtech sector, using knowledge mining and representation, similarity search, and predictive analytics.

ELENA PETROVA – Data Scientist, Auto1 Group

Elena Petrova is a Data Scientist for Auto1, Europe’s leading car trading platform. As part of her work for Auto1, Elena develops supervised machine learning models and Python packages for data processing and feature engineering. As part of Data Natives, Elena will be sharing her insights into unit testing for data science applications.
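Unit testing in data science usually means pinning a feature-engineering function’s behavior to tiny, hand-built inputs, including the edge cases that corrupt models silently. A minimal sketch of the idea – the feature and values here are invented for illustration, not Auto1’s actual code:

```python
def mileage_per_year(mileage_km, age_years):
    """Feature: average kilometres driven per year, guarding against zero age."""
    return mileage_km / max(age_years, 1)

def test_mileage_per_year():
    assert mileage_per_year(50_000, 5) == 10_000
    # Edge case: a brand-new car must not trigger a division by zero
    assert mileage_per_year(8_000, 0) == 8_000

test_mileage_per_year()
```

Tests like these are cheap to run on every change, so a pipeline refactor that quietly alters a feature’s definition fails loudly instead of degrading the model downstream.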

Data Natives will take place on the 22nd & 23rd November at Kühlhaus Berlin – for tickets and more information, please visit the Data Natives website.

Can Smart Data Save TV Broadcasting?
Tue, 07 Apr 2015

A recent paper published by GfK sheds light on the state of the audience data market, and outlines a clear direction for the future of data in an internet-enabled TV world, while exploring the benefits of Big Data for broadcast.

GfK (the Society for Consumer Research), Germany’s largest market research institute and the fourth largest market research organisation in the world, released a new report last week, titled ‘Big Questions, Big Answers: Will harnessing smart data for audience analytics save the broadcast industry?’

“The potential offered by Big Data is immense. Currently, everybody is engaged in data experimentation and there is a lot to fight for,” notes the Global Lead of the Media and Entertainment Industry at GfK, Mr. Niko Waesche.

“There are some key questions and challenges around the use of Big Data, especially within a rapidly evolving TV consumption model. Our latest paper looks at how Big Data is being used in the real world specifically in our industry, and where Big Data will be headed in years to come,” he added.

For this report, insights came from respondents including key decision makers and executives from 14 media groups – companies that cater to more than 70 million subscribers and deliver content reaching nearly a billion people every day. Astro, Channel 4, Digicel, Genius Digital, GfK, Liberty Global International (LBI), Magine TV, maxdome, Orange France, OSN, Sky IQ, UFA, Verizon and Viacom International Media Networks (VIMN) were the participating companies.

“Capturing and acting on behavioral data requires different capabilities and analytics from a purely asset-based approach. Broadcasters and cable providers should assess their capabilities now if they seek to remain competitive,” explained Tom Weiss, CEO of Genius Digital, a partner company of GfK.

Salient findings of the report are:

  • Owing to a “diverse and increasingly demanding” TV audience, operators are shifting from gleaning asset-based data towards behavioral insights, to better understand and respond to the needs of content curators and advertisers.
  • Behavioral data is key. Behavioral data unlocks new insights by capturing the who, what and when of the viewer, and places it in context alongside more traditional asset-based data such as plays and subscribers.
  • Suitable infrastructure helps ensure “intelligent transformation and interpretation” of troves of behavioral and asset-based data, enabling a better understanding of the audience and emerging trends.
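As a toy illustration of the distinction the report draws, the sketch below rolls per-viewer behavioral events (the who, what and when) into both a traditional asset-based metric (play counts) and a behavioral one (when each programme is actually watched). Field names and data are invented, not taken from the report:

```python
from collections import Counter, defaultdict

# Behavioral events: (viewer_id, programme, hour_of_day)
events = [
    ("v1", "news", 20), ("v1", "drama", 21),
    ("v2", "news", 20), ("v3", "news", 8),
]

# Traditional asset-based metric: total plays per programme
plays = Counter(programme for _, programme, _ in events)

# Behavioral insight: the hours at which each programme is watched
watch_hours = defaultdict(list)
for viewer, programme, hour in events:
    watch_hours[programme].append(hour)
```

The same events yield both views: the play count says “news” is the most popular asset, while the hour distribution reveals it is watched at both breakfast and prime time – the kind of context an asset-only count hides.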

The report in its entirety is available here.

Photo credit: Lubs Mary. / Foter / CC BY-NC-SA

The Digital Future of Media & Journalism
Tue, 17 Feb 2015

Elina Makri is the co-founder of oikomedia.com, a networked digital platform designed to trace and connect journalists, fixers and media professionals around the globe. She’s also the Greek editor of dialoggers.eu (a Greek-German collaborative journalism project), and the founder of the Youth Investigative Journalism prize, which aims to train and reward budding data journalists. Thus, she’s uniquely placed to discuss the future of journalism, which, from her perspective, is digitised, data-driven and transnational. We spoke to Elina recently about her work and her thoughts on the future of the ever-evolving media industry.


To begin with, tell us a little more about Oikomedia.

Oikomedia.com is a platform – a very targeted social network for media professionals – that helps journalists trace other journalists, media fixers, photographers, cameramen, sound engineers and so on around the world, in order to collaborate or exchange views, ideas and expertise.

The idea behind oikomedia is the following: media companies and professionals (local journalists, cameramen, fixers, photographers, etc) need to be able to trace other professionals quickly, but also with a degree of accuracy (based on location, speciality, and previous experience).

Think about a journalist in Oslo who has to leave today because of an uprising in Lebanon. He has to quickly find someone trustworthy who, once he’s in Beirut, can help him get a general overview of the uprising, contact people, translate interviews and find interesting stories.

With Oikomedia, he just has to log in (in all probability through his mobile phone), run an advanced search (=> ‘beirut’, => ‘fixer’, => ‘english speaking’, => ‘previous demonstrations covered’), have a look at the portfolios of 5 or 6 local journalists, contact a couple of them directly and wait for an answer.
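That advanced search is essentially a multi-attribute filter over a directory of profiles. A toy Python version of the workflow described above – the schema, names and data are invented for illustration, not Oikomedia’s actual system:

```python
profiles = [
    {"name": "A. Haddad", "city": "beirut", "role": "fixer",
     "languages": {"english", "arabic"},
     "covered": {"demonstrations", "elections"}},
    {"name": "B. Saad", "city": "beirut", "role": "photographer",
     "languages": {"french"}, "covered": {"sports"}},
]

def advanced_search(profiles, city, role, language, topic):
    """Return every profile matching all four filters at once."""
    return [
        p for p in profiles
        if p["city"] == city
        and p["role"] == role
        and language in p["languages"]
        and topic in p["covered"]
    ]

matches = advanced_search(profiles, "beirut", "fixer", "english", "demonstrations")
# matches contains only A. Haddad's profile
```

Requiring every filter to match at once is what gives the search its accuracy: a broad city-only query would return everyone in Beirut, but the conjunction narrows it to the handful of people worth contacting.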


Why do you think there’s such a demand for services like Oikomedia?

I connected with the other Oikomedia founders precisely because there was such a demand. I met Gianluca, my Italian partner, because he was a fixer for quite dangerous reporting in Italy. Without him, foreign media could not “penetrate” the situation. When his reporting was published, Coca-Cola cancelled all its contracts in Southern Italy. There are so many media freelancers out there who need to find partners and set up projects. Plus, we take no commissions from those collaborations or any by-products; we just offer the platform. Moreover, we will soon release virtual bureaus: why maintain an expensive local newsroom or bureau when you can have a global, digital, cost-effective and customised network of media professionals whenever and wherever you need them?

Oikomedia is the answer to an increasing demand. This demand is a direct consequence of the global financial crisis, as well as the challenges presented by new models of reporting and the struggle – for big news agencies and broadcast providers alike – to find business models that pay for the news. Newspapers are cutting more and more of their foreign correspondent offices, and slashing foreign coverage at an alarming rate, despite the fact that readers’ demand for news is exploding.

Pragmatically speaking, media companies (MARKET) very often have no extra money or time to send their staff to foreign countries (NEED). They don’t even have time to search Internet directories in vain for stories and ideas in unknown languages (NEED).

You’ve also established hackathons and an award for data journalism; why do you think data journalism has become so important?

Four words: explosion of available data. This data gives context to journalists’ stories. Data married to narrative structure and expert human knowledge can tell us a lot about our ever-changing world, and can provide checks and balances in a democratic society. Also, I keep in mind the words of David Livingstone, director of the New Jersey Trauma Center at University Hospital in Newark (from a ProPublica story): “In the absence of real data, politicians and policy makers can do what the hell they want.”

So, data journalism can be a powerful tool for:
1. Data control
2. Access to, and analysis of, information
3. New kinds of reporting with citizens’ participation – a method that can actually build the next generation of civic infrastructure by empowering citizens.

What do you consider to be the biggest changes the digital age has brought about it in the media?

1. Tectonic shifts on the business side – we’ve all felt the importance of this. It’s actually a matter of life and death for a media organization. We are no longer sure if the news industry, as such, exists. Print media bleeds red ink.
2. New ways of storytelling: very, very compelling multimedia storytelling. Should I refer to the post-“Snow Fall” era?
3. Greater accountability within journalism. Many journalists are afraid of robowriting. I am not. I definitely believe that the work of the journalist after the digital tsunami has been upgraded.
4. More freedom: on-demand access to content anytime, anywhere; interactive user feedback; “democratization” of the creation, publishing, distribution and consumption of content. Paradise!

What do you foresee in the future of media and journalism in the digital age?

Absolutely better journalism made by new means… for the ones who will survive. Data will become – if it’s not already – a strategic resource for media. The digital age has given people (not only journalists) tools to hold authorities to account, has forced governments to adopt “open data by default” acts, as is the case for the Greek government, and has provided metrics for impact. I wouldn’t worry much about the business models. On the other hand, we should be cautious about surveillance mechanisms: who has access to our data, and for what reason?


(Image credit: Galymzhan Abdugalimov, via Unsplash)
