zkom
Members · Posts: 5,665 · Days Won: 11

Everything posted by zkom

  1. I asked GPT-3 to write a black metal song about cute ponies
  2. I'm prototyping the OpenAI GPT-3 models with my IRC bot. It's going great:
  3. Found this categorization in a comment on a Trailer Park Boys YouTube video
  4. Yeah, Žižek has been surprisingly level-headed during the whole war. Here's another text from back in May: We must stop letting Russia define the terms of the Ukraine crisis https://www.theguardian.com/commentisfree/2022/may/23/we-must-stop-letting-russia-define-the-terms-of-the-ukraine-crisis
  5. Well, there's ambient, chill-out, psydub, synthwave, vaporwave and so on. Lots of electronic music just for listening.
  6. I never use it in casual conversation because it is so cringey to try to explain. I just say I listen to electronic music mostly. It's also more accurate because I think like half of the music I listen to is ambient. Then it's maybe 25% "IDM" and the other 25% is various genres from old disco hits to dub to p-funk to drone metal and noise. I usually don't say I listen to ambient in casual conversation either because the usual next question is something like "it's that new age music right, like whale songs and shit?"
  7. I have this album on cassette. Haven't listened to it for ages though, and I don't really know how I would feel about it now. It's a bit of 90s electronic music that's very weird from a 2020s perspective. I guess the message is positive or something? But the way it's approached is like, eh.. I don't think this would fly today when people get upset about cultural appropriation etc. Somehow the version of Power of American Natives with the vocals is just way worse.
  8. It's not as sweet as you might think. Tastes more like a regular imperial stout with a hint of banana. So it's not like some pastry stouts that are super sweet and thick. Not too bad but I'll probably pick something else the next time. It wasn't a dessert really, I just went out for a single beer and selected a 10.3% 0.44l imperial stout.
  9. Yes, you can see it doesn't really understand what the images actually are supposed to represent. Like the phone displays are not showing the same thing that's happening in front of them or the gladiators seem to be missing legs, etc. It just has a massive amount of 2D images and their descriptions to build the statistical models on and no understanding of the content.
  10. Let me rephrase a bit. I wasn't merely pointing out that it's lying. What I mean is that it talks about things it has no experience of, because it really has no personal experience of anything. It talks about things that happened at some time in its imagined "past", but it has no experience of past, present or the passage of time in general (the algorithm just runs when it's queried). Words representing time, or anything really, are just symbols without any relevance to its own experiences. It merely knows very complicated relationships between the words, not what the words would represent in its personal experience of the reality it exists in. But yeah, in general I think we're mostly on the same page here. I agree with this. I think sometimes people confuse highly intelligent problem-solving systems as being somehow conscious, but they really are just very complicated automatons.
  11. I think the point is not that it doesn't know what the word means, but that it will tell you things about itself that aren't true. Its language model has learned that when someone asks if you've been to a pub last week, a valid answer is "Yeah, I had a really nice time". If you ask the AI if it's sentient, it will just answer like a human would because that's what its language model contains, not because there's any validity to it. Anyway, sentience is a harder question than just whether a given AI is sentient. We don't even know that for biological systems like animals, for example. If you look at the simplest biological self-replicating things, like viruses, it looks pretty clear they aren't sentient. They are basically just automatons that insert genetic data into living cells, which then turn into factories producing more viruses. Single-cell organisms are a bit more complicated: they have multiple parts, don't need a host and can have some very basic responses to their environment. Then you have multicellular things like plants and fungi that have some sort of information networks but no nervous system. Then you get the simplest animals with nervous systems and kind of very simple distributed brains, like insects and worms. Then animals with simple brains, various reptiles etc. And up you go the ladder until you get to complex mammals like humans, which most people agree have sentience. So at which level, from the simple genetic copy machine to a human, does sentience appear? Is there a hard limit, or is it more a matter of degrees of sentience? Also, is it possible that superorganisms, like ant colonies that act as a single organism in their decision making, are sentient too? Are humans also part of bigger sentient superorganisms? Are cities or countries sentient? Are we all just brain cells?
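The GPT-3 IRC bot mentioned in post 2 could be glued together roughly like this. This is only a sketch of the general idea, not the actual bot: the nick, the helper names (`build_prompt`, `extract_reply`) and the prompt format are illustrative guesses, and the API call itself is left as a comment since the GPT-3-era completions endpoint needs an API key.

```python
BOT_NICK = "gptbot"  # hypothetical nick, not the real bot's

def build_prompt(history, nick=BOT_NICK):
    """Turn recent channel lines [(nick, message), ...] into a plain-text
    completion prompt, ending with the bot's own nick so the model
    continues the conversation as the bot."""
    lines = [f"<{who}> {msg}" for who, msg in history]
    lines.append(f"<{nick}>")
    return "\n".join(lines)

def extract_reply(completion_text):
    """Keep only the first line of the completion so the bot doesn't
    hallucinate replies on behalf of other users."""
    stripped = completion_text.strip()
    return stripped.splitlines()[0] if stripped else ""

# The actual call would look roughly like this with the old
# (GPT-3-era) OpenAI completions API:
#
#   import openai
#   resp = openai.Completion.create(
#       model="text-davinci-002",
#       prompt=build_prompt(history),
#       max_tokens=60,
#       stop=["\n<"],   # stop before the model invents the next speaker
#   )
#   reply = extract_reply(resp.choices[0].text)
```

The `stop=["\n<"]` sequence is the usual trick for chat-over-completions: without it the model tends to keep generating fake lines for everyone in the channel.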
