Ford Explorer recalls to affect almost 2 million cars (25 January 2024)

According to the latest news, the Ford Explorer recall will affect 1.9 million cars, meaning nearly two million owners will need to have their vehicles inspected. The action comes after the discovery of a potential hazard: a piece of trim that can come loose and detach from the car. The specific part in question is the A-pillar trim between the front windows and the windshield.

Some of the clips that should hold this trim in place are not doing their job, which Ford attributes to issues during assembly or repair. If you are affected by the Ford Explorer recall, keep reading, because there are some things you need to know.

Ford Explorer recalls concern over a million car owners (Image Credit)

Ford Explorer recalls affecting over 1.9 million cars

The seriousness of this issue has been highlighted by the National Highway Traffic Safety Administration (NHTSA). They have expressed concerns that a detached trim could pose a hazard on the roads, increasing the risk of accidents. Fortunately, Ford has not received any reports of accidents or injuries resulting from this problem.

Interestingly, this issue first came to Ford’s attention in 2018. However, they did not consider it a significant safety risk at that time due to the part’s size and shape. The NHTSA began a more thorough investigation into this matter in February of last year. You can find the report here.

Over 14,000 warranty claims have been related to this trim issue, indicating its widespread nature. According to USA Today, Ford’s spokesperson, Maria Buczkowski, has estimated that only about 5% of these vehicles will actually be affected by the recall. She encourages owners to reach out to their local dealerships once the replacement parts are available. Ford has committed to conducting inspections and replacements at no cost to the vehicle owners.

Ford plans to send out notification letters to the owners in March. Additionally, they have provided a customer service number for direct inquiries. To ensure convenience for the customers, Ford is also offering additional services such as mobile repair and pick-up-and-delivery at participating dealerships. Federal safety regulators have announced they will begin sending safety risk notification letters to owners starting March 13, 2024.

Ford Explorer recalls to affect the models between 2011-2019 (Image Credit)

Models affected by the Ford Explorer recalls

The recall is quite extensive, covering all Ford Explorer models manufactured between 2011 and 2019. If you own a Ford Explorer from these model years, it’s important to stay informed about this recall.


Ford aims to prevent bigger issues

Through this recall, Ford demonstrates a commitment to the safety and comfort of its customers. While the financial implications for the company are not trivial, Ford is prioritizing its customers' well-being. By addressing the issue proactively, the company is taking responsible steps to ensure its vehicles meet safety standards and to prevent potential accidents, though it should be remembered that regulators pushed for the action as well.

In conclusion, Ford’s recall of over 1.9 million Explorer vehicles over a trim issue underscores the company’s dedication to customer safety. This recall, which affects certain models built between 2011 and 2019, is a significant move by Ford to address potential hazards before they lead to more severe problems.

Vehicle owners are advised to contact their local Ford dealership for inspections and repairs, which Ford assures will be carried out free of charge. As this situation unfolds, it serves as a reminder of the importance of vehicle maintenance and manufacturer accountability in ensuring road safety.

Featured image credit: Haryad Ali/Unsplash

How Big Data Brought Ford Back from the Brink (24 August 2015)

In 1913, the Ford Motor Company was at the forefront of car manufacture. Designing the reasonably priced Model T to appeal to the masses, and employing the division of labour and moving assembly lines in its factories, made Ford the largest automobile manufacturer in the world at that time.

In 2007, the Ford Motor Company was in trouble. The end of the 2006 financial year brought with it reports of a $12.6 billion loss, the largest in the company's history. Yet, once again, forward-thinking innovation - this time in the form of data analytics - led Ford back to the path of prosperity. By 2009, Ford was posting profits for the first time in four years, as well as launching 25 new vehicle lines. The same year they sold 2.3 million cars, the only company to exceed the 2 million mark since 2007. They also won the 2013 INFORMS Prize for Company-Wide Efforts in Analytics & Data Science. So how exactly did Ford harness data analytics to bring them back from the brink and thrust them once again to the forefront of their industry?

Big Data, Supply & Demand

The 2013 Ford Escape; Image Credit

An essential component of any successful manufacturing business is having your finger on the pulse of what your consumers really want, and an efficient way of supplying it to them. As Ford research scientist Bryan Goodman explains, “Quite a few customers walk into a dealership and want to leave with a vehicle that day… We have to get the right vehicle with the right engine and right set of features and controls to the right dealerships.” Ford’s solution to this is its Smart Inventory Management System, or SIMS. SIMS integrates a range of data, including data on what is being built and sold; data on what’s being sold relative to what other models are in the inventory; data on what specs are being searched for on the company website; and data on the housing prices and employment rates in dealership locations. By manipulating all of this data, SIMS ensures that dealership stock and car specifications are optimised to consumer demand.

The models produced using SIMS data are fine-tuned to the tiniest detail. “When you get into different roof heights, different interiors, different wheels and so on, we can offer an astronomically large number of combinations,” Goodman states. “Imagine 300 billion and then multiply it by itself again.”
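To make the idea concrete, here is a minimal, purely illustrative sketch of how demand signals like the ones Goodman describes might be rolled into a stocking score for one dealership. The field names, weights and scoring rule are assumptions invented for this example; they are not Ford's actual SIMS logic.

```python
from dataclasses import dataclass

@dataclass
class ConfigSignals:
    """Demand signals for one vehicle configuration at one dealership (illustrative fields)."""
    recent_sales: int             # units of this configuration sold locally in the last 30 days
    inventory_on_lot: int         # units currently in stock at the dealership
    website_searches: int         # spec searches for this configuration from the dealer's region
    local_employment_rate: float  # 0.0-1.0, a rough proxy for local purchasing power

def stocking_score(s: ConfigSignals,
                   w_sales: float = 0.5,
                   w_search: float = 0.3,
                   w_economy: float = 0.2) -> float:
    """Toy score: higher means 'ship more of this configuration here'.
    Demand signals are weighted, then discounted by stock already on the lot."""
    demand = (w_sales * s.recent_sales
              + w_search * s.website_searches
              + w_economy * s.local_employment_rate * 100)
    return demand / (1 + s.inventory_on_lot)

# Rank two hypothetical configurations for a single dealership.
configs = {
    "Escape SE, power liftgate": ConfigSignals(42, 3, 310, 0.94),
    "Escape S, flip glass":      ConfigSignals(11, 7, 45, 0.94),
}
ranked = sorted(configs, key=lambda k: stocking_score(configs[k]), reverse=True)
print(ranked)  # configurations in descending stocking priority
```

In a production system each of these inputs would itself be a forecast rather than a raw count, but the ranking idea is the same.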

Ford also began mining social media posts to gain greater insight into public opinion of their models. "We now use text-mining algorithms to formulate a more complete picture of what consumers want that is not available using traditional market research," said Michael Cavaretta, Ford technical leader for predictive analytics and data mining. Analysis of this kind was used to make decisions on last year's Ford Escape, such as whether it should feature the flip-glass system of the previous model or a new power liftgate. "The picture that emerged was a four-to-one preference for power liftgates that could open and close with the touch of a button," added Cavaretta. "The result was a design that provided improved customer satisfaction while reducing manufacturing complexity and cost."
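A "four-to-one preference" is exactly the kind of figure a simple lexicon-based pass over targeted posts can produce. The sketch below illustrates that style of analysis only; the keyword lists and posts are invented, and this is not Ford's actual text-mining pipeline.

```python
import re

POSITIVE = {"love", "great", "convenient", "easy", "perfect"}
NEGATIVE = {"hate", "annoying", "useless", "awkward", "broken"}

def sentiment(text: str) -> int:
    """Crude lexicon score: +1 per positive word, -1 per negative word."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

def preference_ratio(posts: list[str], feature_a: str, feature_b: str) -> float:
    """Compare net positive sentiment for two features across a set of posts."""
    score_a = sum(max(sentiment(p), 0) for p in posts if feature_a in p.lower())
    score_b = sum(max(sentiment(p), 0) for p in posts if feature_b in p.lower())
    return score_a / max(score_b, 1)

posts = [  # invented examples of targeted forum/social posts
    "The power liftgate is so convenient with a kid in one arm",
    "I love the power liftgate, works great at the grocery store",
    "Flip glass was useless for me, always awkward to reach",
    "Honestly the flip glass is great for quick loading",
]
print(preference_ratio(posts, "power liftgate", "flip glass"))  # 3.0 with this toy data
```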

Improving Efficiency on the Factory Floor

Another problem Ford tackled was the cost of producing prototypes with so many different variants. The cost of producing a single prototype can often exceed $250,000, and Ford were using somewhere in the region of 100-200 full-vehicle prototypes for each product development.

Students from Wayne State University noted that these prototypes were often sitting idle, used to test only a relatively small number of variants when they could be used for more. Their solution? The Prototype Optimisation Model, or POM - a programme which computed the optimal set of vehicles to cover the required testing; in short, the minimum number of prototypes needed to perform the maximum number of tests. Its first use, on the European Transit vehicle, saved Ford an estimated $12 million; if rolled out across the whole company, it could generate an estimated $250 million in savings.
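Descriptions of POM frame it as an optimisation problem: find the smallest set of prototype builds that still covers every required test. A classic way to approximate that is greedy set cover, sketched below; this is a simplified stand-in for the actual model, and the builds and tests are invented.

```python
def min_prototypes(candidate_builds: dict[str, set[str]],
                   required_tests: set[str]) -> list[str]:
    """Greedy set cover: repeatedly pick the build covering the most
    still-uncovered tests until every required test is covered."""
    uncovered = set(required_tests)
    chosen = []
    while uncovered:
        best = max(candidate_builds, key=lambda b: len(candidate_builds[b] & uncovered))
        if not candidate_builds[best] & uncovered:
            raise ValueError(f"No build covers remaining tests: {uncovered}")
        chosen.append(best)
        uncovered -= candidate_builds[best]
    return chosen

# Each candidate prototype build supports a set of tests (invented data).
builds = {
    "diesel / high roof / LHD": {"emissions", "crash-front", "hvac-hot"},
    "petrol / low roof / RHD":  {"crash-side", "hvac-cold", "nvh"},
    "diesel / low roof / RHD":  {"emissions", "crash-side", "durability"},
    "petrol / high roof / LHD": {"crash-front", "nvh", "durability"},
}
tests = {"emissions", "crash-front", "crash-side", "hvac-hot",
         "hvac-cold", "nvh", "durability"}
print(min_prototypes(builds, tests))  # three builds cover all seven tests
```

Greedy set cover is not guaranteed to find the true minimum, but it comes within a logarithmic factor of it, which is usually enough to expose how much idle prototype capacity exists.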

Moving Forward- Performance Data and Alternative Fuel Technologies

The Ford Fusion Energi; Image Credit

Looking to the future, the velocity of Ford’s data is accelerating. The Fusion Energi plug-in hybrid, released last year, streams performance data from the car to Ford and generates 25GB of data per hour. Mike Tinskey, director of vehicle electrification and infrastructure, describes the data collected from the vehicles as “small but growing”. He says “We gather data every time the customer plugs in. We know where they’re plugging in, how many gas miles they drove, how many electric miles, how often they plug in and how often they take trips. It’s helping to shape where we go next with products.” One proposed use of this data is to work out ‘peak times’ of energy usage, and charge customers a lower rate if they refrain from plugging in when power demand is high.
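The "peak times" idea is straightforward to illustrate: bucket plug-in events by hour of day, flag the busiest hours, and price charging outside them more cheaply. The sketch below is a toy version under those assumptions; the events, thresholds and rates are invented and do not reflect Ford's or any utility's actual tariff logic.

```python
from collections import Counter
from datetime import datetime

def peak_hours(plug_in_times: list[datetime], top_n: int = 3) -> set[int]:
    """Return the top_n busiest hours of day across the observed plug-in events."""
    by_hour = Counter(t.hour for t in plug_in_times)
    return {hour for hour, _ in by_hour.most_common(top_n)}

def tariff(hour: int, peaks: set[int],
           peak_rate: float = 0.30, off_peak_rate: float = 0.12) -> float:
    """Per-kWh price: customers who avoid peak hours get the cheaper rate."""
    return peak_rate if hour in peaks else off_peak_rate

# Invented plug-in timestamps (only the hour matters here).
events = [datetime(2015, 8, 1, h) for h in (18, 18, 19, 19, 19, 20, 20, 7, 23, 2)]
peaks = peak_hours(events)                         # {18, 19, 20} with this toy data
print(peaks, tariff(19, peaks), tariff(2, peaks))  # peak vs off-peak price
```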

Ford is also working with Chalmers University of Technology in Sweden to develop a global energy model, to try and anticipate future energy supply and demand.

“The model looks 100 years into the future and can be used to address what-if questions, such as, ‘If we had a carbon dioxide emission target of x, what would that mean for autos, trains and planes?'” says senior technical leader Tim Wallington, who heads the Ford analytics team focused on sustainability issues.

The research has led to Ford developing vehicles which use a range of new fuel technologies- including hybrids, plug-in hybrids, all-electric vehicles and alternative fuel vehicles- rather than pushing one particular fuel technology for the future.

“We did thousands of scenarios, and the bottom line is that given the uncertainties in future costs and efficiencies, it’s not possible to pick a winner,” Wallington says. “Customers can vote with their money as to which they want and which one wins the future. This is the high-level strategy.”

(Featured image credit: Tim RT)

 



 

Ford Focuses on Big Data Ambitions with the New Silicon Valley Research Centre (26 January 2015)

Ford have been at the forefront of top-down big data analytics from the get-go. In our interview with Ford’s Chief Data Scientist Mike Cavaretta last year, he alluded to the opening of a research centre, to propel Ford’s big data research to dizzying new heights. Now, this plan has come to fruition- Ford’s Silicon Valley Research Center had its grand opening last week. The research center aims to drive innovation in connectivity, mobility, and autonomous vehicles.

Mark Fields, Ford’s President and CEO, hopes the centre will keep Ford at the cutting edge of innovation. “This new research center shows Ford’s commitment to be part of the Silicon Valley innovation ecosystem – anticipating customers’ wants and needs, especially on connectivity, mobility and autonomous vehicles,” Fields stated. “We are working to make these new technologies accessible to everyone, not just luxury customers.”

The announcement outlined that:

  1. Ford and Stanford have started an alliance to deliver the Fusion Hybrid Autonomous Research Vehicle to university engineers for the next phase of testing.
  2. Dragos Maciuca, an experienced Silicon Valley engineer, joins Ford from Apple to serve as senior technical leader at the Research and Innovation Center Palo Alto; additional hiring plans will support Ford having one of the largest automotive research teams in Silicon Valley.

This facility is the latest in Ford’s global network of research and innovation centers, including its location in Dearborn, Michigan, which focuses on advanced electronics, human-machine interface, materials science, big data and analytics; and Aachen, Germany, which focuses on next-generation powertrain research, driver-assist technologies and active safety systems, reports their press release.

Situated in the Stanford Research Park, the facility will accommodate 125 researchers, engineers and scientists.

Some of its projects in key areas include:

  1. Connectivity: Ford is integrating with the Nest application programming interface, targeting home energy and emergency system management while on the road through a series of research experiments.
  2. Mobility: As the next phase in Ford’s Remote Repositioning mobility experiment, the Palo Alto team is now testing the ability to drive vehicles located on Georgia Institute of Technology’s campus in Atlanta from the Bay Area to prove out the new technology.
  3. Autonomous vehicles: While Ford’s research and development in autonomous vehicles is a global effort, including ongoing work with University of Michigan and Massachusetts Institute of Technology, the Palo Alto team will expand collaboration with Stanford University that kicked off in 2013.
  4. Customer experience
  5. Big data and analytics: Ford is leveraging its OpenXC platform to help learn how customers are using their vehicles, and is conducting analytics to detect patterns and learnings that can lead to product improvements or new mobility services (a brief illustrative sketch of working with this kind of vehicle signal data follows below).
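OpenXC exposes vehicle signals as a stream of simple name/value JSON messages. The sketch below shows how such a trace might be summarised into basic usage statistics; the specific signal names, values and aggregation are invented for illustration and are not a description of Ford's actual analytics.

```python
import json
from collections import defaultdict

# A few OpenXC-style name/value messages (invented values); real traces are far larger.
raw_trace = """
{"name": "vehicle_speed", "value": 52.4}
{"name": "fuel_level", "value": 61.0}
{"name": "vehicle_speed", "value": 0.0}
{"name": "accelerator_pedal_position", "value": 14.5}
{"name": "vehicle_speed", "value": 33.1}
"""

def summarise(trace: str) -> dict[str, float]:
    """Average each numeric signal over the trace - a toy 'usage pattern' summary."""
    totals, counts = defaultdict(float), defaultdict(int)
    for line in trace.strip().splitlines():
        msg = json.loads(line)
        totals[msg["name"]] += msg["value"]
        counts[msg["name"]] += 1
    return {name: totals[name] / counts[name] for name in totals}

print(summarise(raw_trace))
# {'vehicle_speed': 28.5, 'fuel_level': 61.0, 'accelerator_pedal_position': 14.5}
```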

The opening of this centre marks the beginning of an exciting new chapter in Ford’s development, as well as the development of the automotive industry as a whole.


(Image credit: Marco Ely, via Flickr)

Ford Reveal Strategy for Smart and Autonomous Transportation at CES (7 January 2015)

The 2015 Consumer Electronics Show in Las Vegas saw automaker Ford reveal its 'Smart Mobility Plan and 25 Global Experiments', designed as a move to change mobility worldwide.

“Even as we showcase connected cars and share our plans for autonomous vehicles, we are here at CES with a higher purpose,” said Ford President and CEO Mark Fields. “We are driving innovation in every part of our business to be both a product and mobility company – and, ultimately, to change the way the world moves just as our founder Henry Ford did 111 years ago.”

The U.S. multinational announced its strategy as follows:

  1. Ford Smart Mobility plan to use innovation to take company to next level in connectivity, mobility, autonomous vehicles, the customer experience and big data
  2. 25 global mobility experiments launched this year to test new ideas and address growing transportation challenges; insights gained will shape Ford's future investments
  3. Ford showcases SYNC® 3, its most advanced vehicle connectivity system, and highlights semi-autonomous vehicles on the road today and fully autonomous vehicles in development

“We see a world where vehicles talk to one another, drivers and vehicles communicate with the city infrastructure to relieve congestion, and people routinely share vehicles or multiple forms of transportation for their daily commute,” Fields explained. The experiments would lead to a new generation of transportation and mobility in the coming decade, according to a news release. Aiding in this will be SYNC 3, the company’s new communications and entertainment system that is faster, more intuitive and easier to use with enhanced response to driver commands.

Read more here.


(Image credit: Quan Nguyen, via Flickr)

Top 5 Big Data News & Articles of 2014 (30 December 2014)

2014 has been a massive year for Big Data, as it transitioned from an emerging and exciting technology to a somewhat overused buzzword for any application of data science. This was reflected earlier in the year by Gartner's always insightful 'hype cycle', where we saw the term drop off the 'Peak of Inflated Expectations' and begin its descent into the 'Trough of Disillusionment'.

While companies all over the world have scrambled to add some data science magic to their mix, an increasingly significant knowledge gap has developed between old-school business minds and the new data-driven youngsters. Lessons that Eric Ries started teaching the startup world in 2011 about validating ideas and quick iteration with evidence-based decisions became appropriate across the board. A downpour of new tracking, analytics and visualisation tools has given businesses the opportunity to bring the build-measure-learn cycle into the very heart of their operations, regardless of size or scale.

Let’s take this opportunity to see which were the most exciting Big Data news stories and articles from 2014, based on the amount of engagement across Facebook, Twitter, LinkedIn and Google+:

5. How Ford Uses Data Science: Past, Present and Future

Earlier in the year Eileen McNulty wrote a case study on Ford’s use of Big Data to turn their business around from a $12.6 billion loss in 2006. In November she followed up with Ford’s Chief Data Scientist, Mike Cavaretta, to find out more about the technology and some of their learnings.

4. Indian Government Using Big Data to Revolutionise Democracy

Another follow up post, this time to Furhaad Shah’s article on Narendra Modi’s use of Big Data in the 2014 election in India. Eileen McNulty looked at how Modi’s party was continuing to build their big data strategy into the governance of the country, helping to provide better representation for India’s immense population.

3. The Data Science Skills Network

Throughout 2014 we were very lucky to have Ferris Jumah, one of LinkedIn's immensely talented data scientists, give talks at our events around Europe. He also shared with us some of his personal research into exactly what makes up the skill set of a data scientist. A fascinating read for anyone with a professional interest in the field.

2. Kreditech Raises $40 Million at $190 Million Valuation

Fintech came kicking from the gate in 2014, with automated credit scoring company Kreditech raising $40 million at an incredible $190 million valuation. It was a record for the German fintech scene, and a strong indication of a very promising growth area for the country. Of all the exciting funding stories this year, this one spread the fastest.

1. 60 Seconds With Mark Cuban: Cyber Dust and Data Security

Tech mogul Mark Cuban is now well known for his super secure messaging app Cyber Dust. He spent a little time talking to us about the other potential for Cyber Dust: Licensing out the technology into the Internet of Things market, to keep your notifications and sensor data private and secure. It’s a fascinating move, and we’re excited to see where it gets picked up.

 


(image credit: Eric Fischer)

10 Big Data Stories You Shouldn't Miss this Week (21 November 2014)

“Some people worry that artificial intelligence will make us feel inferior, but then, anybody in his right mind should have an inferiority complex every time he looks at a flower.”

There has been a lot of media attention on Artificial Intelligence this week, with leading influencers taking various positions on the future of its development. Elon Musk, on the one hand, commented on an article published by Edge that AI poses a real existential threat to humanity. On the other hand, Steve Ballmer, the ex-CEO of Microsoft, argued that Artificial Intelligence and Machine Learning are the next frontier of Computer Science. It was also revealed that the School of Engineering and Applied Sciences at Harvard is set to benefit from an undisclosed donation from Ballmer - reaffirming his belief that Computer Science still requires the research and monetary commitment to realise its potential. One thing is for sure, however: whether you're worried about AI or not, its development will continue to see investment by companies across the world - just take a look at tech giants like Facebook and Google.

Aside from AI, there has been some exciting news and articles in other parts of Data Science too. Below you'll find a selection of our favourite pieces. We hope you enjoy!

TOP DATACONOMY ARTICLES

How Ford Uses Data Science: Past, Present and Future

Success stories of how data-driven practices can revitalise businesses are rife today, but there are few as compelling as the story of Ford. In 2006, the legendary car manufacturers were in trouble; they closed the year with a $12.6 billion loss, the largest in the company's history. As we reported earlier in the year, through implementing a top-down data-driven culture and using innovative data science techniques, Ford was able to start turning profits again in just three years. I was recently lucky enough to speak with Mike Cavaretta, Ford's Chief Data Scientist, who divulged the inside story of how data saved one of the world's largest automobile manufacturers, as well as discussing how Ford will use data in the future.

A Team of Researchers Has Found a Way to Predict the Stock Market Using Search Terms

Weeks before the release of their paper, "Quantifying the semantics of search behavior before stock market moves", Dataconomy met with Dr. Suzy Moat and Dr. Tobias Preis to discuss their research on predicting the stock market by analyzing Google and Wikipedia searches. The initial two studies — which asked the question "Is there a relationship between what people are looking for on Google and Wikipedia and subsequent stock market moves?" — were released in April 2013, and received considerable media attention. Now, the two researchers, along with H. Eugene Stanley and Chester Curme, have come out with a follow-up study that seeks to look at their original findings from a different angle — essentially, which particular topics (Politics, Food, Sports) might have a relationship to stock market moves?

Measuring the Mobile App Economy

If there was ever a fascinating time to be immersed in the app economy, it is now. After experiencing a huge and flourishing boom, the industry now faces a period of change, maturity, or decline, depending on who you're talking to. Berlin-based Priori Data are in a unique position to comment upon the changing app landscape; their platform offers users the ability to track and benchmark over 2.5 million apps. We recently sat down with Director of Data Science Michelle Tran and Director of Content Natasha Yarotskaya to talk about what insights into the market their data has uncovered, as well as what to expect from the latest version of their platform, which is released today.

TOP DATACONOMY NEWS

Data Scientists Tackle High-Impact Social Problems at Bayes Impact Hackathon

At a hackathon hosted by Bayes Impact, data scientists were pitted against each other to develop data-driven and implementable software solutions for high-impact social problems. The winning hack will become a Bayes Impact project, staffed with full-time data scientists and engineers who will work with their partners to bring the hack to life.

Steve Ballmer Advocates Machine Learning as the Next Era of Computer Science

Steve Ballmer, former Microsoft CEO and Harvard alumnus, who announced a significant donation to the Computer Science Department at Harvard last week, is advocating machine learning as the next era of computer science. Ballmer expressed his excitement about the ability of computers and IT to process huge amounts of data, not only to see patterns but to suggest actions and understand human intent.

Teradata and MapR Partnership Expands Hadoop Choices within Teradata's Unified Data Architecture

Teradata, the big data analytics and marketing applications company, and MapR Technologies, Inc., a provider of Apache™ Hadoop, today announced an expanded partnership that covers technology integration, road map alignment, and a unified go-to-market offering. The two companies are optimizing integration of the MapR Distribution within the Teradata® Unified Data Architecture™, providing customers more choices when combining Teradata’s best-in-class data warehousing with Hadoop, discovery platforms, and NoSQL options to drive analytic innovation.

TOP UPCOMING EVENTS

28-30 November, 2014 - 5th International Conference on the Digital Home, Guangzhou, China

The development of digital home technology is a trend of information penetration and fusion in daily life. Numerous digital media devices and content have found their way into the home. The growth of media content and broadband bandwidth creates challenges in producing, editing, transmitting, and displaying that content. This brings enormous opportunities as well as challenges for researchers and application developers.

26 November, 2014 - Big Data eXchange, London

We're creating and collecting more data each year than ever before, but how do we derive value from it? Few of us have first-hand experience of mining insight out of our data, and yet increasingly our markets demand such insight, and often in real time. There are practitioners out there in technology and allied fields who have been data-centric for years, and they have the skills and insights in how to adopt modern big data technology and use it to great effect. Those are the people we now look to to help bridge the looming gap between data and insight.

TOP DATACONOMY JOBS

Pricing Manager / Analyst, Wayfair

As a Pricing Analyst you will be responsible for pricing every product that appears on the website. You will manage the daily operational pricing functions while continually seeking to optimize procedures and test strategies to increase gross profit. If you love diving into deep data sets to identify areas for improvement, and are even more enthusiastic about solving those problems, then do not hesitate to apply!

Database Developer NoSQL (Hadoop/Big Data), GameDuell

Work in Berlin for one of the world's largest games websites and excite more than 125 million users. As a Database Developer at GameDuell, you will be responsible for our Hadoop/Big Data infrastructure and will serve as an interface between different teams.

How Ford Uses Data Science: Past, Present and Future (18 November 2014)

Success stories of how data-driven practices can revitalise businesses are rife today, but there are few as compelling as the story of Ford. In 2006, the legendary car manufacturers were in trouble; they closed the year with a $12.6 billion loss, the largest in the company's history. As we reported earlier in the year, through implementing a top-down data-driven culture and using innovative data science techniques, Ford was able to start turning profits again in just three years. I was recently lucky enough to speak with Mike Cavaretta, Ford's Chief Data Scientist, who divulged the inside story of how data saved one of the world's largest automobile manufacturers, as well as discussing how Ford will use data in the future.


As an overview, how do Ford use data science?
So at the moment, we’re primarily trying to break down our data silos. We have a number of projects that are using Hadoop, and we’re actually setting up our Big Data Analytics Lab, where we can run our own experiments and have a look at some of the more research questions.

Back in 2006/07, Ford was having a downturn. Since then, it’s dramatically turned things around. What role did data science play in this?

Thanks for that question, and thanks so much for phrasing it as “data science” and not “big data”. I think at this point in time, “big data” has come to mean so many things to so many people, I think it’s better to focus on the analytical techniques, and I think data science does a pretty good job of narrowing in on that.

So back to 2006-2007- that was around the time Alan Mulally was brought on. He brought with him this idea that important decisions within the company had to be based on data. He forged that from the very beginning, and from the top down. It really didn’t take a long time for people to realize that if the new CEO is asking, “Hey where is the data you are basing your decision on?”, you’d better go out and find the data, and have a good reason why that data matters to this particular decision.

So, it became apparent quickly that we needed people who could manipulate the data. We didn’t call it “data science” at the time, but being able to bring data to bear against different problems became of primary importance.

The idea was that the roadmap really needed to be based on the best data that we had at that time, and the focus was not only good data and analysis, but also being able to react to that analysis fast.

So an 80% solution would allow us to move quickly, and benefit the business more than a 95% solution where we missed the decision point. I think there were a lot of benefits to being able to bring these methods, ideas and data-driven decisions using good statistical techniques. This approach helps to build your credibility, as you’re able to bring great results with good timing- it just worked out well.

What technologies were you using?

At the time, the primary technologies we were using were really more on the statistical side, so none of the big data stuff- we were not using Hadoop. The primary database technologies were SQL-driven. Ford has a mix of a lot of different technologies from a lot of different companies- Microsoft, Teradata, Oracle… The database technologies allowed us to go to our IT partners and say "This is the data that is important, we need to be able to make a decision based on this analysis"- and we could do it. On the statistical side, we did a lot of stuff in R. We did some stuff with SAS. But it was much more focused on the statistical analysis stuff.

What technologies have you since added?

So I think the biggest change from our perspective is a recognition that the visualization tools have got much better. We are big fans of Tableau and big fans of Qlikview, and those are the two primary ones we use at Ford.

We’ve done a lot more with R and we’re currently evaluating Pentaho. So we’ve really moved from more point solutions for solving particular problems, to more of a framework and understanding different needs in different areas. For example, there may be certain times when SAS is great for analysis because we already have implementations, and it’s easier to get that into production. There are other times when R is a better choice because it’s got certain packages that makes that analysis a lot easier, so we’re working on trying to put all that together.


You’ve now begun to collect data from the cars themselves- what insights has this yielded?

So there’s a good amount of analytics that can be done on the data we collect. It’s all opt-in data- it’s all data that the customers have agreed to share with us. Primarily, they opt-in to find charging stations, and to better understand how their electric vehicles are working. A lot of the stuff we are looking at has to do with how people are using their vehicles, and how to make sure that the features are working correctly.


Ford use text mining and sentiment analysis to gauge public opinion of certain features and models; tell us more about that.

So a lot of the work that we’ve done to support the development of different features, and to figure out what feature should go on certain vehicles, is based on what we call very targeted social media. Our internal marketing customers will come to us and ask us, “We’re thinking about using this particular feature, and putting it on a vehicle”- the power liftgate of the Ford Escape is a good example, the three-blink turn signal on the Ford Fiesta is another one. In those circumstances, we will take a look at what most people think about the features on similar vehicles. What are they saying about what they would like to see? But we don’t pull in terabytes of Twitter and we don’t use Facebook- we go to other sources that we found to be good indicators what customers like. It’s not shotgun blasts, so to speak; it’s more like very specific rifle shots. This gives us not only quantitative understanding- this customer likes it and this customer doesn’t- but also stories that we can put against it. And these stories are usually when the customers are talking with each other. One great story is for the three-blink turn signal when one customer was describing, “So I got the vehicle. I got the three-blink turn signal and I’m not sure whether I like it or not.” And other people were chiming in saying “You know what, I kind of got the same impression, give it another couple of weeks and just think about how you’re using it on the highway and if you give it a couple of weeks you’ll like it.”

The first person signed back on a few days later and said “You know you what, you were right, now that I understand how it works and where it should be used- I think I like it now!” It was actually kind of beautiful, and that story we can put in front of people and say “This is the way people are using it, these are the some of things they’re talking about”. So now, we’re not only getting the numbers, but also the story behind it. Which I think is very important.

What can we expect from Ford in the future?

I think the position that we’re in right now is really looking at instantiating the experiments we want to do in the analytics space, linking up the different analytics groups, and really focusing on the way that big data technologies allow us to break down data silos.

This company’s been around for over 100 years, and there’s data in different areas that we’ve used for different purposes. So we’ll start looking at that- and start providing value across the different spaces. We’ve put some good effort into that space and got some good traction on it. I can see that as an area that’s going to grow in volume and in importance in the future.


(Featured Image Credit: Hèctor Clivillé)

Ford builds better cars through Big Data (3 March 2014)

Using Big Data, Ford builds better cars. Car sensors, interactions within the company, and the customers themselves all generate data that Ford can use to build better cars. Michael Cavaretta, the leader of the data science department, was interviewed on GigaOm's Structure Show podcast to explain the role big data plays within the company.

For example, the Ford Fiesta's turn indicator blinks three times in short succession when switching lanes. Through analysing social media references, Ford realised that, while popular in Europe, this three-blink turn signal seemed not to be as popular in the US. Actually, the data showed that the complaints were less about the signal itself - which was popular - and more about the positioning of the switch. The feature itself was well accepted.

Read more at GigaOm or listen to the podcast.

 

Image Credits: Car Dekho