2024-03-07T16:51:18+00:00 | 🔗
I’m rolling. In the video, the Trump lawyer got really flustered and was like “… financial compensation, I mean I don’t want to speculate but ya know, the Uber they were in, he probably paid for that too” https://t.co/Fk9ovOHyzd
2024-03-07T15:02:39+00:00 | 🔗
How much of this AI picks and shovels stuff is just SVB? E.g. having a commodified product serving an industry that is highly correlated and coordinated. Wondering as I see people effortlessly switch and switch and switch.
2024-03-07T14:57:29+00:00 | 🔗
Generative AI is inherently personal. As we guide it toward something truly creative, we are guiding it into a hitherto unknown region. This guidance, and the personal cost of acting on it, throttle the value extractable by the “AI provider”.
2024-03-07T14:35:04+00:00 | 🔗
I mean yeah, anyone curating (assembling?) these datasets is making an explicit decision on how self-aware it SEEMS. I don't see anything wrong with that, just like how I don't see how these models are actually self-aware either. https://t.co/tQXuII9N48
2024-03-07T13:19:27+00:00 | 🔗
@aosuenth @roydanroy Yeah, but I think relative to humans it rounds to zero. I propose the agency-ometer: Rock = 1, Tree = 10, Cow = 100, Human = 1000. On vibes, an LLM is a 3; with functions and multi-turn, currently maybe a 5. I would put Waymo at like 70, random forests at like 2.
2024-03-07T01:38:35+00:00 | 🔗
@nsthorat @modal_labs Same, for example, their secrets management (which I just configured for a new client)... amazing! I just want to buy secondaries somewhere
2024-03-06T19:35:30+00:00 | 🔗
@RDBinns Thanks, I’m still taking classes, it’s very interesting stuff. We are reading Wittgenstein rn and I enjoyed modern metaphysics and logic from the semester prior.
2024-03-06T19:28:35+00:00 | 🔗
@roydanroy If it helps I voted no, I do think there is something distasteful about the entire idea of being self-aware, so I wouldn’t ascribe too much to any discussion of self-awareness.
2024-03-06T19:08:28+00:00 | 🔗
@roydanroy Non-starter because I can’t be cloned. Suppose I can. The two clones surely wouldn’t be the same self: I couldn’t collect the experiences of my two clones into one self. So each time we infer on Claude 3, we are on different clones that we can’t aggregate into one self.
2024-03-06T18:49:43+00:00 | 🔗
@roydanroy Being self-aware is to be aware of a self, which requires a self, which Claude 3 lacks. It can be printed out, duplicated, sharded, paused, and tuned; it has system prompts and tokenizers. Whether self-awareness is even a meaningful question is something else altogether.