  • They went to great lengths to explain that, and why a trailer load may transiently exceed it, and used a 20-year-old wrecked truck as a reference.

    The other concern they mentioned was aluminum’s characteristics over time. Brand-new strength will not equal strength later on: 10k pounds is the truck’s strength at its absolute best, and it will degrade with age. The mix of metals may also cause a galvanic reaction that degrades it further. No one else in the industry will use aluminum for the frame, for good reason.

    They even admit it fared better than they thought, but it’s another example of Tesla ignoring engineering principles and the predicted consequences being demonstrated.


  • Shareholders in a company whose entire business is built around being well liked globally think it would be a bad business move to cancel a program that is part of a good public image in favor of placating a relatively small chunk of Americans who would both be paying attention and be against DEI.

    They could still be pro-Trump and not really care about DEI, but they know the value of optics in the context of a company like Disney.


  • Anti-DEI is pretty much code for “only white straight cisgender males welcome”. Whatever criticisms may make sense against select DEI initiatives, the anti-DEI move by the government basically involved erasing all acknowledgement of any minority or woman ever being honored for accomplishments, no matter how obviously well earned the honor was.

    From a global perspective, the American flavor of “anti-DEI” panders to a relatively small group of folks at the expense of offending the vast majority of the world’s population.




  • I assume there’s a large number of people who do nothing but write pretty boilerplate projects that have already been done a thousand times, maybe with some very milquetoast variations like branding or styling. Like a web form doing one-to-one manipulations of some database from user input.

    And/or a large number of people who think they need to be seen as “with it” and claim success because they see everyone else claiming success. This is super common in any hype phase, where there’s a desperate need for people to claim affinity with the “hot thing”.







  • And because a friend insisted that it writes code just fine.

    It’s so weird; I feel like I’m being gaslit from all over the place. People talking about “vibe coding” to generate thousands of lines of code without ever having to actually read any of it, and swearing it can work fine.

    I’ve repeatedly given LLMs a shot, and the experience is always very similar. If I don’t know how to do something, neither does the LLM, but it will spit out code confidently, hallucinating function names or REST URLs as needed to fit whatever narrative would have been convenient. If I can’t spot the logic issue in some code that isn’t behaving correctly, it will also fail to generate useful text describing the problem.

    If the query is within reach of copy/pasting the top Stack Overflow answer, then it can generate the code. LLM integration with IDEs makes that workflow easier than pulling in Stack Overflow answers, but you need to be vigilant, because a viable result and junk are presented with equal confidence and certainty. It can do a better job than traditional code analysis at spotting issues like string keys with typos in them, and by extension errors in less structured languages like JavaScript and Python (where an ‘everything is a hash/dictionary’ design prevails); a toy example of that kind of bug follows after this comment.

    So far I can’t say I’ve seen improvements. I see how it could be seen as valuable, but the resulting babysitting carries a cost that has been more annoying than the theoretical time savings. Maybe it helps for more boilerplate tasks, but generally speaking those are already heavily wrapped by libraries, and when I have to create a significant volume of code it’s because there’s no library; and if there’s no library, it’s niche enough that the LLMs can’t generate it either.

    I think the most credible time save was a report of refreshing an old codebase that used a lot of deprecated functions, changing most of the calls to the new methods without explicit human intervention. Better than tools like ‘2to3’ for Python, but still not magical either.
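
    As a rough illustration of the string-key typo point above (all names and values here are made up; this is just a sketch, not anyone’s real code):

    ```python
    # A typo in a string key is still syntactically valid code, so linters and
    # type checkers generally won't flag it, while a human or an LLM reviewing
    # the code might notice the mismatch with the rest of the dictionary usage.
    config = {"timeout_seconds": 30, "retry_count": 3}

    def connect(settings: dict) -> None:
        # Bug: "retry_cout" is misspelled, so .get() silently returns the default.
        retries = settings.get("retry_cout", 0)
        print(f"Connecting with {retries} retries")

    connect(config)  # prints "Connecting with 0 retries" instead of 3; no error raised
    ```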







  • Autopilot is not FSD, but these scenarios are supposed to be within Autopilot’s capabilities to react to. There’s no indication that FSD is better equipped than Autopilot to handle these sorts of scenarios. Many of the Autopilot scenarios are the car plowing head-on into a static obstacle. Yes, the drivers should have been paying attention, but again, the point is that Autopilot, even with all the updates, simply fails to accurately model the environment, even for what should be considered easy.

    In terms of comparable systems, I frankly don’t know. No one has a launched offering, and we only know Tesla’s as well as we do because they opt to use random drivers on public roads as guinea pigs, which isn’t great. But again, this video demonstrated “easy mode” scenarios where the Tesla failed and another car succeeded.

    All of that is beside the point anyway: it’s not like radar and lidar would preclude FSD either way. The video makes clear, in both theory and practice, that better sensing technology can only improve the safety of a system; FSD with added radar and lidar would have greater capacity for safety than FSD with cameras alone. Omitting lidar might be forgivable on cheap cars, historically, but removing radar is bonkers, since radar ships on some pretty low-end cars. No one else wants to risk FSD-like capability without lidar because they see it as too risky. It’s not that Tesla knows some magic that makes cameras safe; they are just willing to accept bigger risk, and willing to argue “humans are deadly too”, whereas the competition doesn’t even want to try that debate.


  • For one, I don’t know if “autonomous no matter what” is an important enough goal versus ADAS, but for another, the gold standard in the industry, Tesla excepted, is vehicle-mounted LIDAR, with investments under way to bring down the tech’s price.

    Merging data from different sources was never claimed by anyone to be too hard a problem; even Tesla used to do it and chose to downgrade their capabilities for cost (a toy sketch of what that merging can look like follows below). “It’s just not worth it” is a strange take on a video that demonstrates quite clearly how much better the data from LIDAR is than anything you can possibly get from cameras, and the benefit of avoiding collisions, collisions that kill thousands a year. Even the relatively “won’t turn on unless things are perfect” Autopilot has killed quite a few people, and has been involved in hundreds of accidents beyond that.
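
    Purely as an illustration of what “merging data from different sources” means at its simplest (real systems use far more involved fusion, e.g. Kalman filters; every name and threshold here is invented):

    ```python
    # Toy sketch: combine a camera range estimate with a lidar range estimate,
    # weighting by each sensor's self-reported confidence, and biasing toward
    # the closer (more dangerous) reading from any high-confidence sensor.
    from dataclasses import dataclass

    @dataclass
    class Measurement:
        distance_m: float   # estimated distance to the obstacle
        confidence: float   # 0.0 to 1.0

    def fused_distance(camera: Measurement, lidar: Measurement) -> float:
        total = camera.confidence + lidar.confidence
        blended = (camera.distance_m * camera.confidence +
                   lidar.distance_m * lidar.confidence) / total
        # Safety bias: if a trusted sensor says the obstacle is closer, believe it.
        pessimistic = min(
            (m.distance_m for m in (camera, lidar) if m.confidence > 0.5),
            default=blended,
        )
        return min(blended, pessimistic)

    # A camera fooled by a painted wall vs. a lidar that actually measured the range:
    print(fused_distance(Measurement(80.0, 0.6), Measurement(12.0, 0.95)))  # ~12 m
    ```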