• 0 Posts
  • 66 Comments
Joined 8 months ago
Cake day: January 31st, 2024


  • Assuming I’m an Android fan for pointing out that Apple does shady PR. I literally mentioned that Apple devices have their selling point. And it isn’t UNMATCHED PERFORMANCE or CUTTING EDGE TECHNOLOGY as their ads seem to suggest. It’s a polished experience and beautiful presentation; that is unmatched, unlike the hot mess that is Android. Android also has its selling points, but this reply is already getting long. Just wanted to point out your pettiness and unwillingness to read more than a sentence.



  • Dang, OpenAI just pulled an Apple. Do something other people have already done with the same results (and, importantly, before OpenAI made a big fuss about it), claim it’s their innovation, give it a bloated name so people imagine it’s more than it is, and produce a graph comparing themselves to themselves, hoping nobody will look at the competition.

    Just like Apple, they have their own selling points, but they seem to prefer making stuff up while forgetting why people actually use ’em.

    On a side note, they also pulled an Elon. Where’s my AI companion that can comment on video in real time and sing to me??? Ya had it “working” “live” a couple months ago, WHERE IS IT?!?



  • LANIK2000@lemmy.world to 196@lemmy.blahaj.zone · Andre · edited · 9 days ago

    The thing is, this wasn’t “most men” or even “some men”. It was a regular line for a mountain coaster, everyone from kids to their moms and dads to older folk up to like their 50s, and all of them fucking cheered. I’ve never seen anything like this here. Here there’s always at least one cranky old dude that shits on such people very vocally, and the kids cower. 5 teenagers gawking and giggling or football-fan types yelling isn’t the same as watching a crowd of all ages go WOOOW! Like I’m already fed up with the heat and the shitty long line on the fucking asphalt of all things, and these regular-ass midwestern families still find it in ’em to admire the rich asshole being a nuisance.



  • LANIK2000@lemmy.world to 196@lemmy.blahaj.zone · Andre · 9 days ago

    After visiting America and seeing how much they celebrate assholes, I’m less and less surprised. If you’re standing in line and a car pulls up just to rev its engine as loud as possible, here in Europe we get mad at the fucking obnoxious asshole. Never ever did I want to disassociate from a crowd faster than when surrounded by Americans losing their marbles at seeing an expensive car be loud.

    It’s a tiny example, but it has to start somewhere. Glorifying shit is just wrong.


  • “This process is akin to how humans learn…”

    I’m so fucking sick of people saying that. We have no fucking clue how humans LEARN. Aka gather understanding, aka how cognition works or what it truly is. On the contrary, we can deduce that it probably isn’t very close to human memory/learning/cognition/sentience (or any other buzzword that’s a stand-in for things we don’t understand yet), considering human memory is extremely lossy and tends to infer its own bias, as opposed to LLMs, which do neither and religiously follow patterns to a fault.

    It’s quite literally a text prediction machine that started its life as a translator (and it still does amazingly at that task); it just happens that general human language is a very powerful tool all on its own.
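
    To make “text prediction machine” concrete, here’s a toy sketch of my own (not OpenAI’s code, and vastly simpler than a real LLM): a bigram model that only ever emits the statistically most likely next word, with zero understanding of any of them. Scale this idea up by a few billion parameters and you have the job description of an LLM.

    ```python
    # Toy "text prediction": always pick the most frequent next word seen in training.
    from collections import Counter, defaultdict

    corpus = "the cat sat on the mat the cat ate the fish".split()

    # Count which word follows which in the training text.
    next_counts = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        next_counts[prev][nxt] += 1

    def predict(word):
        """Return the follower of `word` seen most often in training."""
        followers = next_counts.get(word)
        return followers.most_common(1)[0][0] if followers else None

    # Generate by repeatedly predicting -- pure pattern following, no meaning.
    word, output = "the", ["the"]
    for _ in range(5):
        word = predict(word)
        if word is None:
            break
        output.append(word)
    print(" ".join(output))  # here: "the cat sat on the cat"
    ```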

    I could go on and on as I usually do on lemmy about AI, but your argument is literally “a neural network is theoretically like the nervous system, therefore human”. I have no faith in getting through to you people.


  • 80% of my programming work is solving problems and designing stuff. The only productivity boost I got is when working with proprietary libraries that have most of their documentation in customer support tickets (it wouldn’t be a problem if I could just read the bloody source code, or if our company didn’t think that paying UNHOLY AMOUNTS OF MONEY for shit makes it better), or when interacting with a new system where I know exactly what I want but just don’t know the new syntax or names. It’s handy, but definitely not a game changer.


  • LANIK2000@lemmy.world to 196@lemmy.blahaj.zone · Poorly socialized rule · 18 days ago

    As a man I have to agree, we’re fucking spoiled, but in the worst way possible. It’s not just that we allow men to be pigs and monsters, society expects it of us as kids, and if we don’t behave like that, we get called weird and gay and pushed aside as freaks. In my childhood it was very much be shit or be treated like shit. It really fucked my confidence and effectively ruined 90% of my interactions with women, as I became too scared to even look one in the eyes, not wanting to be seen as a threat again just for breathing the same air. Like many men, it’s a miracle I found someone (or rather, I was lucky enough to have a good friend who would introduce me), and thank fucking god, because I was slowly turning into an incel… Now I actually get to be a functional member of society and make someone happy.

    It always triggers me a bit when I complain about not having showered/shaved/groomed myself yet and a woman tells me “you don’t have to, you’re a man”. I understand your daily struggle and injustice as a woman, and that you have it much worse, but what am I to do? Cut off my dick? Shrivel up and die? Maybe then I can become a vile enough rotting lump of shit to be a “REAL MAN”.


  • Except for their low power draw, and thus unmatched battery life in portable devices, the M chips are honestly not impressive performance-wise. That’s not really the appeal, even tho Apple is trying tooth and nail to pretend it is with their unlabeled graphs.

    I mean, if you really don’t want a GPU (which IMO is a must, given that proper hardware acceleration makes up for any CPU shortcomings, but I digress), that leaves you with a much bigger budget for the CPU, and then it’s no longer even close to the M chips, it’s an absolute slaughter.


  • I don’t want to downplay or invalidate any of your preferences, but you HEAVILY misrepresent the competition. Have you seen a non-Apple device in the past 5 years?

    Other companies make metal-body PCs now. From the dinky cheap-ass laptop I bought just for fun to my sister’s proper gaming laptop, there are plenty of metal+glass laptops out there. And when it comes to Android I only really follow Samsung, Sony and Google, but at least those 3 have had metal+glass flagship phones for as long as I care to remember. (Looked it up: Sony since 2013, Samsung since 2015, Google since 2018.)




  • Calling the reward system hormones doesn’t really change the fact that we have no clue where to even start. What is a good reward for general intelligence? Solving problems? That’s our current approach, which has the issue of the AI not actually understanding the problems and just ending up remembering question/answer pairs (patterns). We need to figure out what defines intelligence and “understanding” in an easily measurable way. Which is something people knew almost a hundred years ago when we came up with the idea of neural networks, and why I say we didn’t get any closer to AGI with LLMs.
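
    As a caricature of that failure mode (a toy example of mine, every name and question in it made up): a “model” that earns a perfect reward on its training questions purely by memorization, and has nothing to say about anything it hasn’t seen, not even a trivial rephrasing.

    ```python
    # Caricature of reward-driven training collapsing into memorization.
    train_pairs = {
        "2 + 2": "4",
        "capital of France": "Paris",
    }

    def memorizer(question):
        """Returns the answer it was rewarded for; no reasoning involved."""
        return train_pairs.get(question, "???")

    def reward(question, answer):
        """The only signal being optimized: did the output match the dataset?"""
        return 1 if train_pairs.get(question) == answer else 0

    print(sum(reward(q, memorizer(q)) for q in train_pairs))  # 2, a perfect score
    print(memorizer("2 plus 2"))  # "???" -- same question, trivially rephrased
    ```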


  • In theory. Then comes the question of how exactly you’re gonna teach/train it. I feel our current approach is too strict for proper intelligence to emerge, but what do I know. I honestly have no clue how such a model could be trained. I guess it would be similar to how people train actual brain cells? Tho that field is very immature atm… The neat thing about the human brain is that it’s already preconfigured for self-learning, tho it does come with its own bias on what to learn, due to its unique needs and desires.



  • You can think of the brain as a set of modules, but sensors and the ability to adhere to a predefined grammar aren’t what define AGI, if you ask me. We’re missing the most important module. AGI requires cognition, the ability to acquire knowledge and understanding. Such an ability would make large language models completely redundant, as it could just learn language, or even come up with one all on its own, like kids in isolation do, for example.

    What I was trying to point out is that “neural networks” don’t actually learn the way we do. Using the word “learn” is a bit misleading, because it implies cognition.

    A neural network in the computer science sense is just a bunch of random operations in sequence: in goes a number, out goes a number. We then collect a bunch of input/output pairs, the dataset, and semi-randomly adjust those operations until they happen to somewhat match the collection. The reasoning is done by the humans assembling the input/output pairs; that step is implicitly skipped for the AI. It doesn’t know why the pairs belong together, and it isn’t allowed to reason about why, because the second it spits out something else, that’s an error and the whole process breaks. That’s why LLMs hallucinate with perfect confidence and why they’ll never gain cognition: the second you remove the human assembling the dataset, you’re quite literally left with nothing but semi-random numbers, and that’s why these models degrade so fast when learning from themselves.
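
    Taken almost literally, that’s the whole loop. Here’s a deliberately crude sketch of mine of “semi-randomly adjust the operations until they match the collection”, using random hill climbing on a one-neuron model (real training uses gradients, but the point stands): nothing in it ever reasons about why the pairs belong together.

    ```python
    import random

    # The "dataset": input/output pairs a human assembled. All the reasoning
    # behind the pairing lives in whoever wrote this list down.
    data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]  # secretly y = 2x + 1

    w, b = random.random(), random.random()  # start with random operations

    def loss(w, b):
        """How badly the current operations fail to match the collection."""
        return sum((w * x + b - y) ** 2 for x, y in data)

    # Semi-randomly nudge the numbers, keeping any change that matches better.
    for _ in range(20000):
        dw, db = random.gauss(0, 0.01), random.gauss(0, 0.01)
        if loss(w + dw, b + db) < loss(w, b):
            w, b = w + dw, b + db

    print(round(w, 2), round(b, 2))  # ends up near 2 and 1, no "why" involved
    ```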

    This technology is very impressive and quite useful, and it demonstrates how powerful a tool language alone is, but it doesn’t get us any closer to AGI.