• 0 Posts
  • 23 Comments
Joined 1 year ago
Cake day: July 1st, 2023


  • First, I must confess that over the past few years I have been gravely disappointed with the white moderate. I have almost reached the regrettable conclusion that the Negro’s great stumbling block in his stride toward freedom is not the White Citizen’s Counciler or the Ku Klux Klanner, but the white moderate, who is more devoted to “order” than to justice; who prefers a negative peace which is the absence of tension to a positive peace which is the presence of justice; who constantly says: “I agree with you in the goal you seek, but I cannot agree with your methods of direct action”; who paternalistically believes he can set the timetable for another man’s freedom; who lives by a mythical concept of time and who constantly advises the Negro to wait for a “more convenient season.” Shallow understanding from people of good will is more frustrating than absolute misunderstanding from people of ill will. Lukewarm acceptance is much more bewildering than outright rejection.

    Take a moment, a deep breath, some fresh clean air, and think about why you’re putting so much energy into saying… I dunno, all of the things that you’re saying.

    If this 4B thing were about liberating women from a literal slavery, if they falsely identified you as one of those nasty republicans, if they really did mean absolutely no men whatsoever: is it worth all of this anger you’re feeling? Is your squabble with them over your own love life more important than their fight for freedom? Do you not agree that they should be free?




  • The paradigm shift system also introduces this… I dunno, ducking and weaving style of gameplay? It’s like you’re the conductor of an orchestra looking for the right musical swell at the right time.

    This paradigm shifting is the same kind of pivot you make in other games when a party member needs to stop and focus on healing, except now you have to shift your entire team’s focus. Since each role really needs time and momentum to be effective, you end up making real-time opportunity-cost decisions about which urgent thing needs the most attention, or whether you can split your focus, knowing that a team that splits its focus is much weaker at both things it’s trying to accomplish. I really like the way 13 forces you to think about party formation.

    I also give it credit for establishing the stagger meter, which was such a good idea that they’ve included it in like every game since then.



  • I actually really liked 16’s main storyline. Not sure where I rank it, exactly, but parts of it were extremely cool.

    What I did not like were the barrel-bin jrpg-tier sidequests, where characters show up out of the blue because they’re supposed to be in this scene, and “you really thought I wouldn’t see the two of ya’s slinkin’ off” was, I guess, all the project had the budget for.

    I can’t tell you how many times it felt like a character would tell me to go somewhere to do a thing because they can’t go, and so I’d go do it, only for them to show up anyway so they could thank me with sad music.

    It was just exhausting how shallow and uninspired most of the side content was.


  • Funny enough, 13 is actually the one I’ve replayed the most. I think I’ve beaten it like 3 different times, in addition to whatever runs I didn’t finish. It’s kind of grown on me as one of my favorites.

    Do be ready for about 40 hours of single-path walkways if you ever go back, though. I don’t actually think this is the problem some people make it out to be, but the game isn’t polarizing for no reason.



  • Ah, but here we have to get pedantic a little bit: producing an AGI through current known methods is intractable.

    I didn’t quite understand this at first. I think I was going to say something about the paper leaving the method ambiguous, thus implicating all methods yet unknown, etc, whatever. But yeah, this divide between solvable and “unsolvable” shifts if we ever break NP-hard and have to define some new NP-super-hard category. This does feel like the piece I was missing. Or a piece, anyway.

    e.g. humans don’t fit the definition either.

    I did think about this, and the only reason I reject it is that “human-like or -level” matches our complexity by definition, and we already have a behavior set for a fairly large n. This doesn’t have to mean that we aren’t still below some curve, of course, but I do struggle to imagine how our own complexity wouldn’t still be too large to solve, AGI or not.


    Anyway, the main reason I’m replying again at all is just to make sure I thanked you for getting back to me, haha. This was definitely helpful.


  • Hey! Just asking you because I’m not sure where else to direct this energy at the moment.

    I spent a while trying to understand the argument this paper was making, and for the most part I think I’ve got it. But there’s a kind of obvious, knee-jerk rebuttal to throw at it, seen elsewhere under this post, even:

    If producing an AGI is intractable, why does the human meat-brain exist?

    Evolution “may be thought of” as a process that samples a distribution of situation-behaviors, though that distribution is entirely abstract. And the decision process for whether the “AI” it produces matches this distribution of successful behaviors is yada yada darwinism. The answer we care about, because this is the inspiration I imagine AI engineers took from evolution in the first place, is whether evolution can (not inevitably, just can) produce an AGI (us) in reasonable time (it did).

    The question is, where does this line of thinking fail?

    Going by the proof, it should either be:

    • That evolution is an intractable method. 60 million years is a long time, but it still feels quite short for this answer.
    • Something about it doesn’t fit within this computational paradigm. That is, I’m stretching the definition.
    • The language “no better than chance” for option 2 is actually more significant than I’m thinking. Evolution is all chance. But is our existence really just extreme luck? I know that it is, but this answer is really unsatisfying.

    I’m not sure how to formalize any of this, though.

    The thought that we could “encode all of biological evolution into a program of at most size K” did make me laugh.
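    For what it’s worth, the “evolution as a sampler” framing I’m using can be sketched in a few lines of code. This is a toy illustration of my own analogy, not anything from the paper: candidate “behavior programs” are bitstrings, and selection only ever queries fitness as a black box, the way survival only “queries” the environment. All names and parameters here are made up for the sketch.

    ```python
    import random

    def evolve(fitness, genome_len=20, pop=50, generations=200, seed=0):
        """Toy evolutionary search: sample, select, mutate, repeat."""
        rng = random.Random(seed)
        # Initial population: random bitstring "genomes".
        population = [[rng.randint(0, 1) for _ in range(genome_len)]
                      for _ in range(pop)]
        for _ in range(generations):
            # Selection: the fitter half survives unchanged (elitism).
            population.sort(key=fitness, reverse=True)
            survivors = population[: pop // 2]
            # Variation: offspring are mutated copies of survivors
            # (each bit flips with 5% probability).
            children = [[bit ^ (rng.random() < 0.05) for bit in parent]
                        for parent in survivors]
            population = survivors + children
        return max(population, key=fitness)

    # "Fitness" here is just counting ones; the search never looks inside it.
    best = evolve(fitness=sum)
    print(sum(best))
    ```

    The point of the sketch is only that this kind of process routinely finds high-fitness genomes without any notion of "understanding" the fitness function, which is why I keep wanting to treat evolution as one more sampling method the proof should cover.
    
    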




  • Imaginary grenades.

    Having porn made of you is imaginary?

    At some point the “it’s just a game” also stops holding water…

    The video game doesn’t produce anything.

    AI is not the cause for generating deep fakes,

    DUIs can be reduced with public transportation. What do you propose reduces… porn fakes?

    Ain’t it interesting how coming up with a consistent framework makes it applicable to different areas of life?

    Fucking lol.

    My problem with machine learning porn is that it’s artless generic template spam clogging up my feed of shit I actually want to see. But you know, to each their own.







  • And how ads on TV are sometimes so much louder than the show they’re cut between. And the glitches! Sometimes, you have to completely power cycle your phone to fix something simple. And how Facebook’s curated, algorithmic feed sends people down extremist pipelines, fueling things like public shootings and the January 2021 Capitol riots. And how the continued atomization of society into smaller and smaller pieces (e.g. suburbia) has made people lonelier than they have ever been. And how the displacement of work onto capable machines never seems to yield benefits for the people whose work is being displaced, only their bosses.

    I guess if all you remember are Letterman’s fumbling grandpa jokes about what the Internet is, gosh dang, even useful for, I could see why you’d think nobody’s criticisms are real.