I think you may be confusing Android Auto with Android for Cars. The former is the mobile app plus a head-unit integration (screen and audio controls) delivered via a video stream, which requires minimal integration with the vehicle. The latter is Android repurposed for head units, with full car integration, as in Volvo's implementation.
Manufacturers of luxury cars shy away from Android Auto partly because they believe it turns one of their main selling points and silos (advanced technology) into yet another commodity.
Thx for the definitions - useful/interesting (but I didn't double-check them, hehe) :)
> which requires minimal integration with the vehicle
My guess: maybe VW is scared of a potential "which CURRENTLY requires minimal integration with the vehicle" (I imagine that the push for more and more integration will increase in the future) and therefore decided to give it a try now and not to become passively dependent on Google.
Because Audi views themselves as a performance brand over a luxury brand. They're also hungrier and thus more responsive to consumer demand.
BMW allows CarPlay with a monthly subscription to the rest of their services, largely because they believe that as long as you're getting the full benefit of a connected car, you'll be attached to the value. Strategically, it makes no sense for them to give it away for free.
The other thing? These companies are starting to change their mind as consumers demand it as a feature.
I always knew there were good reasons I prefer Audi.
Although to be honest, if not for the unconscionable amount of money they (or Navteq, I guess) ask for map updates, I'd be even happier just using the built-in navigation all the time.
If a Tesla were driving around a parking lot in the wrong direction, it definitely looks like some of the offenses listed, e.g. careless, reckless, or improper driving, would cover it.
So yes, there is a good case to be made that it's illegal.
>We need to break their grip on the desktop/server/laptop market.
I'm curious as to why you think this "needs" to happen?
I have my own reasons, for example, openness of ISA and IP from up top at the browser all the way down to the firmware on the chip; however, I'm curious to hear yours.
- A monoculture of HW leaves us more susceptible to security issues. See Spectre and Meltdown. While not eradicated by other hardware designs, the problem is generally mitigated; see the relative difference in what Spectre and Meltdown affected as evidence.
- A monoculture of HW may lead us to a local maximum of capability. As we approach the theoretical limits of the hardware in specific materials areas, we are presented with more and more complex architectures to eke out smaller and smaller gains. This makes the pursuit of alternative architectures more and more costly to invest in from a business standpoint, knowing that it may be years or decades before they compete favorably with existing products. The more alternative architectures that can be pursued concurrently, the shorter the time to market for an alternative, should a specific niche fit its capabilities well. (E.g. it's taken a decade, but Apple has made ARM competitive to the point that it may be a viable laptop processor competitor, possibly even the better option at some point.)
- Multiple avenues of research often yield benefits of cross-pollination. CISC and RISC ended up converging somewhat in the middle as they adopted the best features of each other. The more lines of research we have (and use in the market, so they get sufficient funding and attention), the more likely we are to see benefits for all related hardware.
A slightly different framing than openness that is used from time to time is owner controllability. Older x86 systems were underdocumented, but at least there was the hope of reverse engineering to replace boot firmware, limit device firmware, and develop libre device drivers. Not so on today's OEM systems: the CPUs come cryptographically fused to accept only the OEM's boot firmware, and with locked-down, privileged security co-processors.
I don't want to live in a world where everything new is an iThing that says "I can't help you, Dave." That could turn retro-computing from a hobby into an essential life skill for preserving freedom!
> I'm curious as to why you think this "needs" to happen?
I think Intel has been too dominant for too long. They've gotten complacent and sloppy, as shown by microarchitectural vulnerabilities like Spectre and Meltdown.
Also, I figure breaking Intel's dominance will further marginalize Microsoft. I still hold a grudge over how horrible an experience Windows 98 was.
> breaking Intel's dominance will further marginalize Microsoft
Microsoft is not stupid; Microsoft has already ported everything to ARMv8, both for the Snapdragon laptops and for servers (which they only use internally for now, but I hope Amazon's Graviton is pushing them in the ARM-public-cloud direction).
Fwiw, the NT kernel was designed for portability from the ground up. During initial development they targeted x86 and the Intel i860, and it has been ported to pretty much every CPU that had relevance over time: Itanium, Alpha, ARM, and PowerPC. While that branch is probably not maintained currently, it would be pretty easy for Microsoft to get it going again.
If you want to run Linux in the cloud, you've got a bunch of providers (Google, Amazon, Microsoft, etc)
Microsoft simply wants to be in that market, so they'll offer the same products their competition does. Anything they can do to make Linux better on Azure is considered a net gain for them
I've been a vegetarian for a few years. Part of the impetus for my choice was the lack of regulation and health code standards in this country.
Recent news that the federal government is reducing inspection of meat really does scare me. How many people need to die (like in the early 1900s) before we re-realize the importance of these regulations on public life? When will the next Upton Sinclair come along?
Unfortunately even fruit and vegetables aren't safe.
Syphilis and TB antibiotics are being used to combat disease on orange trees in Florida:
> The E.P.A. has proposed allowing as much as 650,000 pounds of streptomycin to be sprayed on citrus crops each year. By comparison, Americans annually use 14,000 pounds of aminoglycosides, the class of antibiotics that includes streptomycin.
> In its decision to approve two drugs for orange and grapefruit trees, the E.P.A. largely ignored objections from the C.D.C. and the F.D.A., which fear that expanding their use in cash crops could fuel antibiotic resistance in humans.
> Scientists are especially worried that the drugs will cause pathogenic bacteria in the soil to become resistant to the compounds and then find their way to people through groundwater or contaminated food. [1]
Several lawmakers wrote to the EPA last month to urge a rethink of the policy given legitimate scientific concerns around the issue of antimicrobial resistance. To quote the letter:
> "Antibiotics are life-saving medicines and, except in extraordinary circumstances, should only be used to treat specific illnesses in people and animals," the lawmakers wrote. "EPA's assessments appear to ignore scientific evidence, violate the principle of judicious antibiotic use, and could create unnecessary harm to human health by authorizing an unprecedented amount of medically important antibiotics to be used for plant agriculture." [2]
The most terrifying thing about antibiotic resistance is that bacteria will readily take up DNA fragments from dead neighbors (or via a sex pilus).
This means that if a benign bacterium develops resistance, that gene can easily find its way into a pathogenic one if they are both in the same environment.
Yes. And people sought to address the specific causes of tubercular beef while leaving the rest untouched, because they didn't care about the exploitation and abuse of laborers as much as their own health.
I mean the same thing happened to Orwell. He was a democratic socialist who literally went to war with the fascists in Spain but people think Animal Farm is meant to be a rebuke of communism generally (and not Stalinism specifically).
It's simple to get to know your local rancher. You can even voice your concerns and have them heard by the person raising your beef while walking the ranch on a nice day. Factory farms are the bigger problem, not meat per se.
This isn't practical because it requires so much effort from the consumer. It's a lot of time and coordination and it's relatively expensive. Farmers markets are one opportunity but high quality organic ranchers are breaking their backs processing meat and driving it to all the farmers markets, and are barely turning a profit off $16/lb meat.
I'm working in this space for a tiny company that delivers high quality, low antibiotic use, low environmental impact, pasture raised single-source meat products in California. We're going to run farm education "meet the rancher" events to get people out to where the meat is actually produced, but we have to be realistic: that's more for the instagram photo ops to convince people that we're legitimate instead of to truly change consumer behavior.
The ugly truth is Americans eat too much meat, and we expect it to be dirt cheap. Meat has been heavily subsidized, and all the externalities from industrial meat production have been ignored, for the past 60 years at least. Turning back the clock on those consumer expectations is going to be very painful.
I'm also extremely worried about the rise of industrial meat production in the developing world, but it seems elitist of wealthy Americans to complain about Chinese agriculture standards when we can't even fix it in our own backyard.
>Factory farms are the bigger problem, not just meat per-se.
Not just meat farms! One thing vegans and vegetarians often gloss over is the death per calorie involved in crop fields. Those thousands of acres have animals in them, and the machines sure don't stop for every varmint that gets in the way.
It is not clear what the right answer is (assuming all you care about is the number of animal lives, which is an inadequate measure anyway). The best advice I know of is to eat only well-raised cow meat, because one cow provides a lot of calories, while vegetables, chickens, pigs, etc. do not and kill lots of animals.
It takes significantly more feed to raise a cow. The calories extracted from cow meat are extremely inefficient considering the calories (feed) and water that go in. So you couldn't use feed (that's vegetable death-per-calorie again!), because it's something like an 8:1 calorie ratio of feed to beef. You'd have to pasture-raise the cows.
There is no feasible way to ensure that all food is pasture-grazing cow meat at a price point accessible to most Americans.
There is currently no cruelty-free way to have cheap, accessible calories. If you really wanted the lowest death per calorie, one of the most feasible approaches would be consuming large amounts of rice.
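A back-of-envelope sketch of the feed-calorie argument above, in Python. All the numbers here (the 8:1 ratio, the diet split) are illustrative assumptions from this thread, not sourced figures:

```python
# Toy comparison of crop calories eaten directly vs. routed through beef.
# All numbers are illustrative assumptions, not sourced figures.
FEED_TO_BEEF = 8        # ~8 calories of feed per calorie of beef (assumed)
DAILY_CALORIES = 2000   # one person's daily intake
BEEF_FRACTION = 0.25    # fraction of the diet eaten as beef (assumed)

beef_cal = DAILY_CALORIES * BEEF_FRACTION            # calories eaten as beef
crop_cal = beef_cal * FEED_TO_BEEF \
         + (DAILY_CALORIES - beef_cal)               # feed + directly eaten crops
print(crop_cal)  # total crop calories needed, vs. 2000 if crops were eaten directly
```

With these toy numbers, a quarter-beef diet requires 5500 crop calories per day, nearly triple the 2000 needed if the crops were eaten directly, which is the sense in which every calorie of feed carries its own death-per-calorie cost.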
I don't think anyone would want to subsist exclusively on a diet of rice, beans, and dietary supplements.
> There is no feasible way to ensure that all food is pasture-grazing cow meat at a price point accessible to most Americans.
Maybe Americans (Westerners in general, actually) should simply eat less? 3 oz of meat per day is enough (5 oz of protein is recommended, but part of that can easily be filled by grains, legumes, cheese, milk, etc.). No one needs 10 oz of meat per day (the current average consumption for Americans).
> I don't think anyone would want to subsist exclusively on a diet of rice, beans, and dietary supplements.
I think this is part of the problem in dominantly meat-eating cultures: people don't realize how diverse and nutritious plant life is. They treat meat, and perhaps milk/eggs, as real food and the rest as mostly fillers and in-betweens (I know I did).
Even without meat/milk/eggs/animal products, without rice and beans, even barring all legumes and cereal grains, probably even barring all grain-like seeds (e.g. amaranth, buckwheat, chia, quinoa, also sunflower, poppy, hemp...), there would still be enough plant life for a diverse and probably even fully nutritionally sufficient diet without artificial supplements (a couple of micronutrients can be tricky, e.g. B₁₂).
Many, perhaps even most of the remaining diverse crops are currently notably more work to grow, but they should still have significantly less ecological impact than eating excessive amounts of meat.
If a cow is not raised in pasture, it's not a well raised cow.
Ruminant holon systems like those used by Polyface Farms (Joel Salatin of The Omnivore's Dilemma fame) are incredibly efficient at both producing calories and utilizing waste products.
What do you propose? That we start eating crickets? Actually I would try some, maybe they taste like shrimp. Shrimp are another quickly reproducing species, but they're also sensitive to pollution.
I've had a dish of lemon grass stir-fried cricket. It was at a restaurant that served lots of bug based dishes. It was one of the few that we had that I'd regularly order.
They didn't taste like shrimp though. They also have a lot of chitin.
Those crop fields are also harvested to feed farm animals. Eating meat requires more crops (and thus acreage) per calorie to the human consumer than eating the crops directly.
Eating meat will always have a higher death per calorie. An informed vegan knows that there is no such thing as a cruelty-free diet, only one that minimizes it.
This is not an issue of "staying woke". Arguing that statutory rape is "absurd" is not "staying woke". Stallman said some things that were deeply troubling to a lot of people, myself included. That aside...
His judgement in making these statements, in a public way, was extremely poor. As a representative of FSF and MIT, he stepped far out of the boundary of what's considered reasonable. Leadership means knowing how to lead, and that includes not demoralizing your organization with controversy.
He didn't step down. He was fired with the option of keeping his dignity.
It's in fact morally absurd for something to be statutory rape in one jurisdiction and not in another, depending on age. That's a fact; if you don't like it, you can kindly go bury your head in the woke sand.
It's "morally absurd" for different jurisdictions to self-govern?
Or for them to disagree with each other?
What universal age do you think everyone in all jurisdictions should be forced to accept, and how do you intend to enforce your moral superiority on them?
Taken logically, some jurisdictions have legislation and legislators that allow for, and it could perhaps be argued support, rape according to other jurisdictions. Given that this is the case, how does such a rape-supporting jurisdiction have any legal authority whatsoever to, for instance, compel another jurisdiction to turn over people guilty of other crimes, say murder? And vice versa: according to the rape-supporting jurisdiction, how can one which supports imprisoning innocent people for things which are not criminal have any legal relationship with the other?
There's no real age where you can put a cutoff, yet the prosecution is allowed no discretion.
Neither is the judge.
Juries are supposed to apply the very harsh letter of the law, no exceptions.
These laws are bad and should be reformulated to handle real life, which has corner cases (e.g. an interstate romance between minors: what about that?) and where prosecution could cause more damage than it helps.
You just completely ignored the roles of Attorneys General, District Attorneys, Governors who can pardon, Jury Nullification, judges who are not constrained by mandatory sentences... Probably more.
People deserve to have a clear law.
Everyone always hates thresholds. That's just a fact of life.
These laws are good. If you can't wait for someone to be legal, then you shouldn't be having sex with them in the first place. Find a different partner.
> prosecution could cause more damage than it helps
How about no. It's already incredibly difficult to convict someone of rape, because consent comes down to he-said / she-said. Keeping statutory rape on the books makes it much easier to convict people who went after children, at least.
PS, I am not a lawyer, or a child psychologist, and I highly doubt you are, either.
It always seems like incumbents have it all (resources, distribution, talent, etc.) until they don't. For example, the iPhone theoretically should have come from Nokia by that logic. AR in particular requires truly novel innovations for it to work in the mass market. There might be niche cases that work for AR (studio/arcade games, sports tech) before the mass-market use cases.
For a secret project, Apple is remarkably well documented as throwing tons of resources at compelling AR products.
I'm not saying they're going to succeed, but they're certainly trying - unlike Nokia, who was in the middle of a massive internal war over Symbian when the first iPhone hit.
Apple is (or used to be) very good at waiting for the right time when all necessary components for a good product are ready or close to ready. There is a lot of value in watching other companies fail and learn from them.
Apple has ARKit on iPhones, and it's usable right now.
Apple has the built-in "measure" app, though that can only measure things on a tabletop, and the Mac Pro table-top demo (https://www.apple.com/mac-pro/ in Safari on iOS). The iOS game Egg, Inc. has a mode where you can see your farm projected on top of a nearby surface. It's kind of neat to see the little chickens run around on my kitchen table.
What's holding AR back (on the iPhone) then, is a killer app.
Vuforia Chalk is one for remote support. See what the supportee sees, and point out things, visually, to them. Circle which lever to pull, and using Apple's ARKit[0] means the circle stays on the right lever, even when the phone is jostled. It's a cool use case, though their UX is horrible.
IMO the problem is a hand-held smartphone is an awkward AR platform.
[0] Note that there's some vendor lock-in by Apple. The Vuforia Chalk feature I mentioned above is image stabilization and an overlay. Image stabilization (and face detection) is an old problem in computer vision, it's commonly used to add silly hats to faces in video calls.
> IMO the problem is a hand-held smartphone is an awkward AR platform
...doooh! ARKit might be good, but it needs to be coupled with a Google Glass successor. Imo GG was the "nokia 900" of its era: proof that it can be done and that there's sort of a market, but worthless without something to make it worth the inconveniences for regular users.
To be honest, I'd bet on a social-media-experienced company like Facebook, or on whoever does a good job of partnering with a social media giant, because the only AR applications I can imagine being compelling enough for "average people" will involve social things. Think walking around a bar and seeing FB, Twitter, and maybe Tinder profiles floating near people's heads like game stats, sourced via facial recognition from people who set themselves to "open". It will be creepy beyond belief, but I'd bet on a mix of social and mildly sexual with a dash of gaming: like a cross between Tinder and Pokémon Go. Now, god knows who has the chops to pull that off without getting too creepy...
If you have read access, then yes. Conventional desktop and server Linux distributions would allow this behavior, as does Android. Good luck using dylibs without it, anyway.
Since the android market is so fragmented and customized, this probably saves them from having to buy lots of phones when diagnosing crashes.
The knee-jerk reaction is to feel uncomfortable but these are system files, shipped with the phone, that are accessible to anyone who purchases the phone. This saves FB the trouble of spending $200 every time a new OS update comes out. Personally, with that knowledge, I don't have a problem with this - however, I have a ton of problems with other stuff FB does so I'm happy to keep not using their service.
> If you have read access, then yes. Conventional desktop and server linux distributions would allow this behavior.
The difference is in people's expectations of mobile vs. desktop apps. You'd never install untrusted software on your desktop, but mobile OSes provide the sense that software is isolated. In Android, that's mostly an illusion.
I feel like users install untrusted software on the desktop all the time, and it's called closed-source software.
It's not like Facebook is some small, unknown malware peddler such that its software should be considered "untrusted". If anything, it's untrusted because it comes from a scummy company and is opaque (due to being closed source).
You're right that it being from Facebook makes things a little different. At the same time, I've never needed to install a native desktop app from Facebook and I'd have some suspicion about doing so if such a thing existed, for exactly this reason.
> The difference is in people's expectations of mobile vs. desktop apps. You'd never install untrusted software on your desktop
I knew many Linux desktop users who had installed the Slack client back in the days we used Slack at work. Myself I have installed Skype. Not that I find Skype particularly good, but sometimes I need to communicate with people who have no clue about software freedom.
So, yes the number of "untrusted apps" is significantly lower on a (Linux) desktop, but "you'd never install" is an incorrect characterization.
I'm not making a moral judgement (FB is a big yikes), just technical. They'd have to:
- build lists of every phone, including carrier variants and internal revisions (pretty common!), to make sure they had a complete library
- rely on the manufacturer to publicly post the ROM (cheaper manufacturers won't do this), or somehow retrieve the URL from the update mechanism (said URL is not easily accessible from userspace)
- handle the multiple different packaging mechanisms that Android phones, especially on older versions, use (Google has gone a long way in remediating this, but FB has to support billions of devices that don't adhere to best practices)
- For ROM packages that are encrypted, they'd need to acquire the keys from real devices.
- and they still would not have visibility into non-posted firmware, such as factory versions with day-1 upgrades (aka many, many devices)
1. Uploading files from the user's phone to their servers is a straight-up copyright violation in plenty of cases.
2. I have doubts that you need copies of all kinds of system libraries to debug that crash. They won't help you debug a crash dump (assuming they don't have debug symbols left in for some reason). They generally won't help you reproduce the crash unless you actually know reproduction steps - it wouldn't surprise me if they tracked every user action, but I doubt they do - so it takes many of those crashes to even start debugging. At that point you probably know precisely which library you need and can obtain it legally.
That said, I agree that uploading the files themselves is not necessary to fingerprint users (the hashes would totally suffice). Unless they do the uploading as a cover-up story, which doesn't make much sense either.
At the very least, the privacy-respecting solution would be to upload hashes and only upload libraries once some critical mass of users had reported the hash along with a bug. Even then, you would only upload the files themselves from some capped number of users.
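As a minimal sketch of that hash-first idea (purely illustrative, not Facebook's actual mechanism; `/system/lib64` is just the usual Android location for 64-bit system libraries), reporting digests instead of the files themselves could look like:

```python
import hashlib
import os

def sha256_of(path, chunk=1 << 16):
    """Stream a file through SHA-256 so large libraries never sit fully in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def fingerprint_libs(lib_dir="/system/lib64"):
    """Map library name -> digest; only these hashes would ever leave the device."""
    report = {}
    for name in sorted(os.listdir(lib_dir)):
        path = os.path.join(lib_dir, name)
        if name.endswith(".so") and os.path.isfile(path):
            report[name] = sha256_of(path)
    return report
```

The server can then match digests against libraries it already holds and request an actual upload only for unknown hashes past some user threshold, which is exactly the capped scheme described above.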
But...what about my pitchfork? The knee-jerk reaction to every Facebook blog spam entirely diminishes the harm they've done to nations around the world.
Yeah, sorry: they could just send ro.build.fingerprint instead if they really wanted to know which builds and devices out there are causing issues.
I can see this as an opt-in but not as a silent, default behavior.
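For reference, that fingerprint is readable on-device without touching any library files. A minimal sketch (assuming the standard Android `getprop` tool is on the PATH; off-device it simply isn't present):

```python
import subprocess

def build_fingerprint():
    """Return ro.build.fingerprint via getprop, or '' when not on an Android device."""
    try:
        out = subprocess.run(["getprop", "ro.build.fingerprint"],
                             capture_output=True, text=True, check=False)
        return out.stdout.strip()
    except FileNotFoundError:
        return ""
```

One short string like `google/walleye/walleye:8.1.0/...` identifies the exact build, which is why it would suffice for matching crashes to firmware versions.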
I have moderate hearing damage in one of my ears, mostly in the midrange according to audiograms. Too many raves without hearing protection.
I have lost the ability to hear people in bars, or the 200sqft tile room where the elevators are at work, or when people are holding multiple conversations in a conference room.
I'm fortunate that I was able to identify the problem and prevent it from getting worse. It's amazing how just a small amount of missing information has effectively broken the "noise canceling and directional isolation" "software" of my brain.
This will sound paradoxical but... Wearing earplugs can really help.
Hearing loss isn't so much an inability to hear "quiet" (despite how hearing tests are performed) as it is a loss of dynamic range. By wearing earplugs you can lower the volume of all conversations and prevent "clipping", so to speak.
Try it! Next time you're having trouble hearing someone, try plugging your ears with your fingers.
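A toy numeric model of that "loss of dynamic range as clipping" idea (purely illustrative, not a model of actual hearing):

```python
def damaged_ear(signal, gain=1.0, limit=1.0):
    """Apply a gain (earplugs = gain < 1), then hard-clip at the ear's reduced ceiling."""
    return [max(-limit, min(limit, gain * s)) for s in signal]

speech = [0.5, 1.6, -2.0, 0.8]           # arbitrary units; peaks exceed the ceiling
clipped = damaged_ear(speech)             # loud input: peaks flatten at the limit
plugged = damaged_ear(speech, gain=0.4)   # attenuated first: no sample hits the limit
```

With no attenuation the loudest samples flatten at the ceiling and the waveform's shape (what speech intelligibility depends on) is destroyed; attenuating everything first keeps the whole signal under the ceiling, so the shape survives even though it's quieter.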
An older guy from the rave scene once taught me this without explaining the process at all.
When we'd be out and loud music was playing and somebody couldn't hear what he was saying to them, he'd lean in, speak more softly and place a finger lightly on the front part of their ear (so as to not put his finger in their ear).
He was always much easier to understand once he had done that, and we all picked up the habit. We weren't sure if the finger acted as a conductor or what the explanation was, but reducing the dynamic range seems like a really good explanation.
I actually do. I have a couple of different silicone plugs with different capabilities, including hifi, low att, etc.
They work, but it's annoying wearing them all the time. I also have Sony XM3s adjusted to reduce most (not all) noise with ANC, and that helps quite a bit as well.
I have big problems understanding conversations in bars or larger groups where other people seem to be able to follow without problems. But based on the hearing tests I have taken, my hearing is actually very good, way better than average. I believe that somehow my brain doesn't assemble the input correctly, so even though the right signals are there, I am not using them.