• 0 Posts
  • 11 Comments
Joined 1 year ago
Cake day: November 5th, 2023





  • https://x.com/OpenAI/

    This is one small example, but I get notifications on developer livestreams for new models, new API updates, and feature releases. The OpenAI subreddit is not only hours late in publishing any of them, it also carries only a fraction of the updates coming directly from the company itself. This extends to many other orgs and people I follow.

    I’m a developer, so I like quick access to new info on many frameworks and languages (and on other lead devs who post updates).



  • X is unfortunately where the people I want to follow post. If they posted on Mastodon, I would use that more. As it stands, a lot of people and creators I want to keep up with are only on a few select platforms at the moment. Maybe that’ll change in time, but I doubt it will anytime soon. Same situation with YouTube: I’d like to stop using that too, but it’s the only place to find certain things (small example: individual magicians who sometimes perform on Penn & Teller also post their own videos on YT only).



  • One of the major breakthroughs wasn’t just compute hardware; it was things like the “Attention Is All You Need” paper that spawned all the latest LLMs and multi-modal models (video generation, music generation, classification, sentiment analysis, etc.). So there has been an insane amount of improvement in the neural network architectures themselves (RNNs, LSTMs, convolutional neural nets, Transformers, etc.). RNNs go back to 1972, and LSTMs only came out in 1997, come to find out.
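
    For anyone curious, the core of that Attention paper is surprisingly small. Here's a minimal sketch (my own toy NumPy version, not anyone's production code) of scaled dot-product attention, softmax(QKᵀ/√d_k)V, the building block Transformers stack up:

    ```python
    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        """Scaled dot-product attention from Vaswani et al. (2017):
        softmax(Q K^T / sqrt(d_k)) V."""
        d_k = Q.shape[-1]
        # similarity of each query to each key, scaled to keep softmax stable
        scores = Q @ K.T / np.sqrt(d_k)
        # row-wise softmax (shifted by the row max for numerical stability)
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        # each output row is a weighted average of the value vectors
        return weights @ V

    # toy example: 3 positions attending over 3 positions, d_k = 4
    rng = np.random.default_rng(0)
    Q = rng.standard_normal((3, 4))
    K = rng.standard_normal((3, 4))
    V = rng.standard_normal((3, 4))
    out = scaled_dot_product_attention(Q, K, V)
    print(out.shape)  # (3, 4)
    ```

    A real Transformer adds learned projections for Q/K/V, multiple heads, and feed-forward layers, but this weighted-average trick is the part that replaced recurrence.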

    2009–2011 was when we got good image recognition. Transformers took off after the Attention paper in 2017. Now the models are improving themselves, so the singularity is heading our way pretty quickly.