It’s not always easy to distinguish between existentialism and a bad mood.

  • 6 Posts
  • 183 Comments
Joined 2 years ago
Cake day: July 2nd, 2023

  • To get a bit meta for a minute, you don’t really need to.

    The first time a substantial contribution to a serious issue in an important FOSS project is made by an LLM, no caveats attached, the PR people of the company that trained it are going to make absolutely sure everyone and their fairy godmother knows about it.

    Until then it's probably OK to treat claims that chatbots can handle a significant bulk of non-boilerplate coding tasks in enterprise projects by themselves the same way you'd treat claims of haunted houses: you don't really need to debunk every separate witness testimony, because it's self-evident that a world where an afterlife freely intertwines with daily reality would be notably and extensively different from the one we are currently living in.



  • Given the volatility of the space I don't think it could have been doing things much better; I doubt it's getting out of alpha before the bubble bursts and things settle down a bit, if at all.

    Automatic PR generation sounds like something that would need a prompt and a ten-line script rather than LangChain (a rough sketch of what I mean is at the end of this comment), but it also seems both questionable and unnecessary.

    If someone wants to know an LLM's opinion on what the changes in a branch are meant to accomplish, they should be encouraged to ask it themselves; there's no need to spam the repository.
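
    Purely for illustration, a minimal sketch of the "prompt plus a short script" idea, assuming the OpenAI Python client and a branch diffed against main; the model name and prompt wording are placeholders, not anyone's actual setup:

      # Hypothetical sketch only: turn a branch diff into a PR description with one prompt.
      # Assumes the `openai` Python package (v1+) and OPENAI_API_KEY set in the environment.
      import subprocess
      from openai import OpenAI

      # Diff the current branch against main (adjust the base branch as needed).
      diff = subprocess.run(
          ["git", "diff", "main...HEAD"],
          capture_output=True, text=True, check=True,
      ).stdout

      client = OpenAI()
      response = client.chat.completions.create(
          model="gpt-4o-mini",  # placeholder model name
          messages=[
              {"role": "system",
               "content": "Write a concise pull request description for this diff."},
              {"role": "user", "content": diff},
          ],
      )
      print(response.choices[0].message.content)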




  • Maybe it's just CEO dick-measuring, so chads Nadella and Pichai can both claim a rock-hard 20-30% while virgin Zuckerberg is exposed as not even knowing how to put the condom on.

    Microsoft CTO Kevin Scott previously said he expects 95% of all code to be AI-generated by 2030.

    Of course he did.

    The Microsoft CEO said the company was seeing mixed results in AI-generated code across different languages, with more progress in Python and less in C++.

    So the more permissive the language is at compile time, the better the AI comes out smelling? What a completely unanticipated twist of fate!




  • Conversely, people who may not look or sound like a traditional expert, but are good at making predictions

    The weird rationalist assumption that being good at predictions is a standalone skill that some people are just gifted with (see also the emphasis on superpredictors as a thing in itself, just clamoring to come out of the woodwork but for the lack of sufficient monetary incentive) tends to come off as if an important part of the prediction market project were for rationalists to isolate the Muad'Dib gene.