2024-06-04T18:13:30+00:00 | 🔗
@GregKamradt "Please get me the price of a carton of eggs at the closest grocery store to [insert physical location]" "Is it raining where you are right now?"
2024-06-04T04:23:45+00:00 | 🔗
The long, short road from trying to do RAG, to trying to ground LLMs, to appreciating the Federal Rules of Evidence, to understanding how reaching truth requires a human in the loop. The thing that cannot be faked is a human on the stand under penalty of perjury.
2024-06-04T04:14:47+00:00 | 🔗
@raunakdoesdev Claude w a sycophantic neuron perpetually activated, I heard it on the NYT Hard Fork from Josh Batson at Anthropic, surprised it isn't more widely reported https://t.co/LJ4DI0shyo
2024-06-04T03:40:27+00:00 | 🔗
GenAI for coding is not so different from React, which is why Facebook loves releasing it. More GenAI = more advertising; Google will do great too.
2024-06-04T03:39:12+00:00 | 🔗
Golden Gate Claude and Emperor's New Claude show how closely tied alignment and capabilities research are. OpenAI's disregard for alignment will be its downfall in the race to superintelligence, but its focus on the developer experience will lead it to great commercial success.
2024-06-04T03:37:43+00:00 | 🔗
Training and inference should be directly coupled. There will be no loading weights in and out of RAM. One chip, one model. Models will be fine-tuned directly on chip, with all weights updated and stored on chip. It won't be possible to directly update weights without just training
2024-06-03T16:40:02+00:00 | 🔗
RT @netcapgirl: “ai is about to take everyone’s jobs” the ai: https://t.co/rKXeOtiDA0
2024-06-03T15:13:52+00:00 | 🔗
Training and inference are intrinsically coupled. Looking at people as a guide, most training is done by the employer on the job. General-purpose training costs, e.g. private school, tutoring, and higher ed, have ballooned. https://t.co/LYCaSF2D3v
2024-06-01T03:14:14+00:00 | 🔗
RT @abacaj: GPT-4 wrapper startups taking money from VCs https://t.co/DlJAfGdXNw