AI replacing programming

What does the future of programming actually look like?

Hello everyone, it's Ben's Bites, your daily AI whack-a-mole. As soon as you think you're on top of everything, something else pops up!

Let's get to it.

🤌 Ben's Picks

This article has been doing the rounds: "The End of Programming" (link). The author, who has a Ph.D. in CS from Berkeley, argues that programming will become obsolete as AI systems that are trained rather than programmed become the norm. In fact, he predicts that even "simple" programs will be generated by AI rather than written by hand. I agree: I believe the majority of software creation in the future won't require knowing how to write code.

Semi-related:

Observations from spending Christmas programming with ChatGPT. A key point: you still need some coding knowledge to get the most out of using ChatGPT this way (for now!), much like an artist will likely produce better AI-generated images than I could. (link)

Zoomscape - Create stunning Zoom backgrounds with AI. (link)

🛠️ Cool Tools

  • Better Call Bloom! - Ask AI any legal question, based on the open-source model BLOOM. (link)

  • ChatSonic API - A ChatGPT-like API with up-to-date information from Google, digital art and paintings, personalised avatars, and more! (link)

  • ROME - Create a personal avatar from just a single image. (link)

  • Tensai - Conversational UI for your codebase. (link)

  • Rizz! - The world's most powerful AI, built into your keyboard. (link)

  • qqbot - A variant of ChatGPT that lives in your IDE. (link)

🎓 Learn

  • AI for game development - creating a farming game in 5 days - part 1. (link)

  • Welcome to Intelligent Automation - lessons in automation, AI, and low/no code. (link)

🔬 Research

Did you know that we can compress large language models to make them more efficient? A paper called "SparseGPT: Massive Language Models Can Be Accurately Pruned in One-Shot" shows we can eliminate 50% of the weights in a 175B-parameter model without losing much performance. This is "model compression", and it's all about reducing the space and computation needed to run a model. Some techniques reduce the numeric precision of the variables that hold the weights, while others "prune" weights away, converting the model to a sparse representation that's mostly zeros.
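To make those two ideas concrete, here's a minimal sketch in PyTorch (my choice of library, not from the paper), applied to a single stand-in weight matrix:

```python
import torch
import torch.nn as nn

layer = nn.Linear(4096, 4096)  # stand-in for one weight matrix in an LLM

# Pruning: zero out the 50% of weights with the smallest magnitude.
w = layer.weight.data
threshold = w.abs().flatten().kthvalue(w.numel() // 2).values
w[w.abs() <= threshold] = 0
print(f"sparsity: {(w == 0).float().mean():.0%}")  # ~50%

# A mostly-zero matrix can be stored in a sparse format to save space.
w_sparse = w.to_sparse()

# Lower precision: keep weights in fp16 instead of fp32, halving memory again.
layer.half()
```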

In this paper, the authors show a way to sparsify the weight matrices of really big models, up to the 175B-parameter scale. They use a clever algorithm to get rid of half or more of the weights at various model sizes, all in one go. No fine-tuning needed afterwards: just run the routine on an off-the-shelf model and you're set.
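For a feel of what "one shot, no fine-tuning" means in code, the sketch below does a single pruning pass over an off-the-shelf checkpoint. It uses plain magnitude pruning from torch.nn.utils.prune as a stand-in; SparseGPT's actual routine is a smarter layer-wise method, and the model name here is just a small example I picked:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune
from transformers import AutoModelForCausalLM

# Any off-the-shelf checkpoint works; a small OPT keeps the demo cheap.
model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)  # zero 50% per layer
        prune.remove(module, "weight")  # make the zeros permanent

# Done in one pass -- no fine-tuning step follows. You'd then measure
# perplexity to check how much accuracy the pruning cost.
```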

Here's the surprising part: the larger models are actually easier to sparsify. In other words, they lose less accuracy when you eliminate a given fraction of the weights. It turns out that parameter redundancy grows with the size of the LLM!

Overall, this is really encouraging news. The biggest LLMs still require expensive hardware to run, but techniques like this might let us cut down from 8 GPUs to 4 for a 175B-parameter model (for now). It's a bit like how Stable Diffusion made image generation accessible by squeezing a capable model onto modest hardware, and now we're seeing something similar with LLMs. (source)
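The back-of-the-envelope math, using my own rough numbers (fp16 weights, 80 GB GPUs, and assuming the kernels actually exploit the sparsity):

```python
params = 175e9
weight_bytes = params * 2                # fp16 = 2 bytes/weight -> ~350 GB
gpus_dense = weight_bytes / 80e9         # ~4.4, so ~5-8 GPUs once you add overhead
gpus_sparse = weight_bytes * 0.5 / 80e9  # ~2.2 at 50% sparsity
```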

  • SparseGPT: massive language models can be accurately pruned in one-shot. (link)

  • ConvNeXt V2: Co-designing and scaling ConvNets with masked autoencoders. (link)

  • Muse: a text-to-image transformer model, more efficient than diffusion or autoregressive models. (link)

  • Diffusion probabilistic models for scene-scale 3D categorical data. (link)

  • Argoverse 2: next-generation datasets for self-driving perception and forecasting. (link)

👋 Too many links?! I created a database for all links mentioned in these emails. Refer 1 friend using this link and I'll send over the link database.

🤓 Everything else

  • Ahead of AI #4: A big year for AI. (link)

  • Perfect AI fingers with Protogen Stable Diffusion model. (link)

  • ChatGPT failures - GitHub repo containing failure cases for ChatGPT. (link)

  • Awesome ChatGPT Prompts - a collection of prompts that get ChatGPT to play specific roles. (link)

  • ChatGPT vs Google Search: What you should know. (link)

  • How China is building a parallel generative AI universe. (link)

🖼 AI images of the day

🤗 Share Ben's Bites

Send this to 1 AI-curious friend and receive my AI project tracker database!

or copy/paste this link: https://bensbites.beehiiv.com/subscribe?ref=PLACEHOLDER

👋 See ya

โญ๏ธ How did we do?

How was today's email?


โญ๏ธ REAL REVIEWS

Join the conversation

or to participate.