brain garbage: Word Shart Online edition

Microsoft is short for “Mike Rowe and His Soft Boys”

6 Likes

Cus 7 8 9

Currently there’s a man feverishly typing into a computer at Walt Disney corporation. He keeps asking ChatGPT to generate an entire script for Toy Story Six

4 Likes

I remember working at Popcap like 10 years ago and after EA bought them and Peggle for mobile flopped they were like “no more games except Plants Vs Zombies and Bejeweled” and then we got Plants Vs Zombies Heroes, a card battler game, and Bejeweled…something…or another (Bejeweled Skies I think?). We had a goat game that was pretty far along that was really cute and fun, but of course it didn’t fit the metrics or whatever so it was cancelled despite everyone loving it. I wonder if Pixar is like that now, nothing but Cars and Toy Story sequels forever.

9 Likes

everything about pixar movies is so precisely calculated for maximum marketable sterility that i don’t think they’d risk using a chatbot to knock one out

1 Like

Gas station olive oil.

5 Likes

Pennezoil

4 Likes

floW ratiuG

5 Likes

DDLG allin

4 Likes

BBL Allin

2 Likes

“Good game” allin

6 Likes

Massive equivocations going on between “is consciousness physical”, “is consciousness computational”, and “can consciousness be done with computers”. Answers are yes (99.999999999_% certainty), no or at most partially (98% certainty), and yes (99.999999999_% certainty). Really the compute cap on the ability to simulate brains just means:

1. P-zombies are impossible.
2. Consciousness is not ontologically computational in most or all domains; it can only be modeled as computational. All physical processes can be modeled as computational, so this is a ridiculous thing to be causing confusion. Hasn’t anyone asked why, if physical processes substitute for computation, we don’t just use efficient physical processes as substitutes for inefficient long-form computation? Why is the goal the other way around, to take simple physical processes and translate them into literal computation?
3. Consciousness can never be “simulated” with computers, it can only be DONE with computers, and their being computers is of marginal importance to this at best. Full panpsychism has not been ruled out, so computers might all be conscious already, but more hopefully EM field theory is correct and a conscious computer is just a computer of any complexity that interfaces with electromagnetic fields.

2 Likes

semi-researched brain garbage: i was thinking about information density as some sort of measurable property of data, and how it sort of converges on meaninglessness at both ends of the complexity scale

the thing that brought this up was the “compression challenge” which (in my memory) offers a prize to anyone who can compress a file made of purely random noise such that the compressed file plus the program that decompresses it is smaller than the original file. i tried to research this more but i kept getting confused at the numbers so i got bored
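for what it’s worth, the reason that prize is safe money is a counting argument: there just aren’t enough shorter strings to go around. a tiny sketch (the numbers here are mine, purely illustrative):

```python
# Pigeonhole argument: there are 2**n bit strings of length n,
# but only 2**n - 1 bit strings strictly shorter than n
# (lengths 0 through n-1 combined). So no lossless compressor
# can shrink *every* length-n input; some inputs must collide or grow.
n = 20
strings_of_length_n = 2 ** n
strictly_shorter = sum(2 ** k for k in range(n))  # 2**0 + ... + 2**(n-1)

print(strings_of_length_n)  # 1048576
print(strictly_shorter)     # 1048575
```

and the challenge’s extra twist, counting the decompressor’s size too, only makes it harder.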

anyway my thought was that a string of a million 0’s is about as meaningful as a string of a million numbers that are either 0 or 1, and so there must be some way of measuring this sort of…important information density? that would make things at both ends of the scale meet up.

i looked at kolmogorov complexity but this seems to describe something that only increases as randomness increases in my example. so that would mean that a million 0’s is very low, and a million random numbers is very high.

but you can get a rough equivalent of a million random numbers with a very simple instruction: “flip a coin 1 million times”. and no, it would not be the same numbers, but it would be just as meaningful as the original in every way.

anyway i didn’t find anything but surely it exists. i’m not nerdy enough to be able to understand any of these wiki pages past the first two sentences, so if there is a thing to describe this lemme know. and if not can we call it “Cania’s Distance?”
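one practical angle: kolmogorov complexity itself is uncomputable, but any off-the-shelf compressor gives an upper bound on it, which makes the asymmetry concrete. a quick sketch with python’s zlib (sizes and seed are just illustrative):

```python
import random
import zlib

# Kolmogorov complexity is uncomputable, but a general-purpose
# compressor bounds it from above: the compressed size.
random.seed(0)  # illustrative seed

zeros = bytes(125_000)             # a million zero bits
noise = random.randbytes(125_000)  # a million random bits

# the all-zeros file has a tiny exact description, so it compresses hard;
# the random file has no short exact description, so it barely shrinks
print(len(zlib.compress(zeros, 9)))  # well under 1% of the original size
print(len(zlib.compress(noise, 9)))  # roughly the original 125,000 bytes
```

(also: the property that’s low for both all-0s and pure noise does have existing names, “sophistication” and “effective complexity” are the usual ones, so “Cania’s Distance” may have prior art.)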

5 Likes

entropy is the usual measure for this, Shannon entropy to be specific, and higher entropy means higher information.

information theory roughly states that data with high information needs a longer expression in a given encoding scheme than low-information data

so your examples:

  • a million 0s : 1000000 x [ 0 ]
  • a million random 0s or 1s : 1000000 x [ α ∈ { 0, 1 } ]

that second encoding isn’t preserving the contents of the data, so it assumes the information is the length of the data. it has more information because its representation in my encoding scheme is longer

if you’re thinking “wtf no it doesn’t, it’s just random data”, I would say that depends on the interpretation of the data according to the semantic system it’s participating in. and then refer you to predicate calculus
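to make the entropy numbers concrete, here’s a minimal sketch of shannon entropy per symbol on those two examples (function name is mine):

```python
import math
import random
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy of a sequence, in bits per symbol."""
    counts = Counter(data)
    n = len(data)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# a million 0s: one symbol, zero surprise, zero entropy
zeros = "0" * 1_000_000

# a million coin flips: maximum surprise per binary symbol
random.seed(1)  # illustrative seed
coin = "".join(random.choice("01") for _ in range(1_000_000))

print(shannon_entropy(zeros))  # 0.0
print(shannon_entropy(coin))   # just about 1 bit per symbol
```

note it lands on 0 bits/symbol for the constant string and ~1 bit/symbol for the flips: entropy measures the distribution the symbols are drawn from, not any particular string, which is exactly the “flip a coin 1 million times” point above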

4 Likes

severian should’ve had access to adderall

4 Likes

Heh nerds.
Just reduce the space between the numbers and use a smaller font.
EZPZ.

4 Likes

technically i suppose that if you decrease your font size from a double digit number to a single digit number, it could actually be a shorter file

2 Likes

GLHF Allin
BM Allin (Bad Mannersing Allin)

BL Allin (Boys Love Allin)
GM Allin (General Manager Allin)

5 Likes