Toyota, Intel, others to form auto big-data consortium
Toyota, chipmaker Intel and other technology and auto companies are forming a consortium to create an ecosystem for big data used in connected cars, the Japanese automaker said on Thursday.
Swedish telecom equipment maker Ericsson, Japanese auto parts maker Denso Corp and telecoms firm NTT DoCoMo are also part of the group, called the Automotive Edge Computing Consortium.
The consortium aims to use data to support emerging services such as intelligent driving, creating maps with real-time data and driving assistance based on cloud computing, Toyota said in a statement.
As cars are equipped with new capabilities, from staying in lanes to driving themselves, they are using and producing vast amounts of information, including where they drive.
Data volume between vehicles and the cloud is expected to reach 10 exabytes per month around 2025, about 10,000 times larger than at present, Toyota said.
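Toyota's two figures imply a third: if 10 exabytes per month is about 10,000 times the present volume, today's vehicle-to-cloud traffic works out to roughly 1 petabyte per month. A quick sketch of that arithmetic (the figures are the article's; the byte conversions are standard SI):

```python
# Back-of-envelope check of the quoted figures:
# 10 exabytes/month, said to be ~10,000x the present volume,
# implies current vehicle-to-cloud traffic of about 1 petabyte/month.
projected = 10 * 10**18          # 10 exabytes, in bytes
factor = 10_000                  # "about 10,000 times larger than at present"
present = projected / factor

print(f"implied present volume: {present / 10**15:.0f} petabyte(s)/month")
# prints: implied present volume: 1 petabyte(s)/month
```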
Only if you believe in their 10 exabytes of marketing bull crap.
I'll give you an example... I went out for a photo shoot over the weekend and came back with over 1 terabyte of photos. Sure, I can brag about the sheer amount of data storage I consumed, using all kinds of sensational jargon, but after I toss out the garbage and transfer the good stuff to JPEG, huh, turns out it barely took a few gigs. Then I look at the folder I optimize for public viewing at a reasonable resolution... a few hundred megs at most. So from over a terabyte down to a few hundred megs, what's the compression ratio? I'll let you do the math. By the way, it would be even more extreme if I were shooting video.
^^ Uh, don't think so. 10 exabytes is 10^19 bytes. A few gigs is 3x10^9, or call it (1/3)x10^10. So you'd be talking a compression ratio of about 3x10^9:1, or 3 billion to 1. I'd like to know what lossless compression could achieve that.
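The ratio the comment above sketches can be checked directly. The inputs are assumptions taken from the thread, not measured numbers: 10 exabytes for the article's fleet-wide monthly figure, and "a few gigs" pegged at 3 GB:

```python
# Reproduce the commenter's back-of-envelope compression ratio.
# Assumed inputs (from the thread, not measured data):
#   projected fleet volume: 10 exabytes = 10 * 10**18 bytes
#   "a few gigs":           3 GB       = 3 * 10**9 bytes
exabytes_per_month = 10 * 10**18
few_gigs = 3 * 10**9

ratio = exabytes_per_month / few_gigs
print(f"ratio = {ratio:.2e} : 1")
# prints: ratio = 3.33e+09 : 1  (about 3 billion to 1, as the comment says)
```

Note the comparison in the thread is loose: the 10-exabyte figure is fleet-wide traffic, while the photo-shoot example is one person's storage, so the ratio illustrates scale rather than an actual compression factor.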