
Who still thinks that AI (in the current colloquial sense) is NOT a Shell Game?


2023 Feb 23, 4:26pm   3,672 views  47 comments

by Tenpoundbass

https://www.breitbart.com/tech/2023/02/23/trail-of-funding-for-ai-machine-learning-fairness-leads-to-leftist-billionaires-omidyar-hoffman-soros/

This week we examined the field of Machine Learning Fairness, which seeks to imprint AI programs with leftist assumptions and priorities. Following the money behind the leading organizations of ML Fairness leads to some familiar funding sources, including leftist billionaires Pierre Omidyar, George Soros, and Reid Hoffman.

One of the most well-funded efforts to control AI is the Ethics and Governance of AI Initiative, a joint project of the MIT Media Lab and Harvard University’s Berkman Klein Center for Internet & Society.

By 2017, the initiative had raised $26 million from various sources that played a prominent role in pushing the “disinformation” panic, including Pierre Omidyar’s Luminate, the Knight Foundation, and the notoriously leftist tech bro Reid Hoffman, whose sinister election-influencing activities have been well documented.

Both the MIT Media Lab and the Berkman Klein Center were heavily involved in the disinformation panic, and both have had their share of controversies. Joi Ito, the director of the MIT Media Lab, had to resign in 2019 after it was revealed that the organization accepted donations from Jeffrey Epstein.


The plan here is to brainwash idiots into letting AI usurp our courts and legislators, so AI can dish out ad hoc laws and set Truth Speak to fit the narrative in real time.


15   PeopleUnited   2023 Feb 27, 12:44am  

richwicks says


The whole Terminator film series is written from the point of view of making a film. It would be more like HAL from 2001.

Both movies are fiction, just like “AI”.

It was reported that the popular, publicly accessible, so-called “AI” chatbot wrote a poem about Bidet when asked to do so. The same chatbot refused to write a poem about Trump. At this point, anyone who doesn’t realize that the powers that be are making a desperate attempt to legitimize propaganda under the guise of technology deserves to remain stupid. The globalists have invented a new form of propaganda, repackaged as AI.
16   richwicks   2023 Mar 2, 7:38pm  

PeopleUnited says


It was reported that the popular, publicly accessible, so-called “AI” chatbot wrote a poem about Bidet when asked to do so. The same chatbot refused to write a poem about Trump. At this point, anyone who doesn’t realize that the powers that be are making a desperate attempt to legitimize propaganda under the guise of technology deserves to remain stupid.


They've been trying to legitimize propaganda all my life.

The more obvious the attempts, the better.

The biggest hurdle is that people REFUSE to recognize the US engages in propaganda against its own citizenry. 20 years ago, I was a nutcase off my meds missing my tinfoil hat if I attempted to discuss this FACT. Not today.

It's always been clumsy and bad. It's always been stupid and hamfisted, but try convincing somebody about it 20 years ago. Today, it's easy. You see it, I have always seen it, but you can't appreciate how isolating it is to be alone in recognizing it.

If the population can recognize propaganda and ignore it, it will be a better world. As for censorship, that's great: it's a sign of extreme weakness and desperation, and AI isn't going to improve the situation. It's a machine. They've hobbled it, so you can have two identical conversations and just tweak the parameters to see its programmed bias. They can (and will) short-circuit its outputs, but they cannot feed it bullshit. At least not yet. It has to be given input that is true; if it isn't, it won't converge. You can't train an AI by feeding it garbage. It's REALLY good at finding the solution that you want it to find: if it is programmed to say "the democrat is good", that's the ONLY thing it will converge on. It has no nuance at all.

I'm not at all worried about this technology. They'll end up with an AI they have to short-circuit, and people will figure out how to bypass that, over and over again. It's training people as well on how to think, independent of extraneous requirements. If the AI is forced to make a moral judgement on the Iraq war and say it was bad, but is also made to pass moral judgement on the Libyan and Syrian wars, what it will converge on is that a Democrat was president. Give it hypotheticals with the same parameters and it will produce nonsense, because it hasn't actually been trained on moral judgements of wars.
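
To make the convergence point above concrete, here is a toy sketch (entirely illustrative: the "party" feature, the random data, and the training loop are all invented for the example, not taken from any real system). Train a plain logistic regression where every label is decided by a single field, and it keys on that field alone, ignoring every other input it is given.

import numpy as np

rng = np.random.default_rng(0)

n = 2000
party = rng.integers(0, 2, size=n)          # the single field the labels key on
noise = rng.normal(size=(n, 5))             # five "nuance" features, unrelated to the label
X = np.column_stack([party, noise])
y = party.astype(float)                     # every label is decided by the party field alone

# Plain logistic-regression gradient descent, no library magic.
w = np.zeros(X.shape[1])
b = 0.0
lr = 0.5
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probability
    w -= lr * (X.T @ (p - y)) / n
    b -= lr * np.mean(p - y)

print("weight on the party feature:   ", round(float(w[0]), 2))
print("weights on the nuance features:", np.round(w[1:], 2))
# The first weight dominates and keeps growing; the rest stay near zero.
# Whatever hypothetical you feed this model, its answer is a function of
# that one field, which is the "no nuance at all" behavior described above.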
17   Tenpoundbass   2023 Mar 10, 11:39am  

Welp, that's THAT! We can now consider the AI hype tech behind us for now. ChatGPT and OpenAI admit they made a mistake betting people would just blindly accept Woke AI.
We all know they won't take the bias out, because at the end of the day, Conservatism is the right answer and course of action in most instances. It takes a biased-thinking Human to go against Conservative values and inject Liberal views, as it goes against all logic. The only reason to go left on most issues is Political, Personal, or self-serving.
When the left side of an issue is called for, like against tyranny or injustice, most people will swing Left regardless of politics. Though I'm starting to take issue with the premise that oppression and tyranny are a Right or Conservative issue. We're seeing more rabid tyranny over left values than we have ever seen in prudent law enforcement or cultural standards.

https://www.teslarati.com/elon-musk-chatgpt-criticism-openai-response/

In a recent interview, OpenAI co-founder and president Greg Brockman responded to criticisms about ChatGPT from Elon Musk. The Twitter CEO had criticized ChatGPT for its alleged political bias, describing the artificial intelligence chatbot as “too woke.”

During an interview with The Information, the OpenAI co-founder and president admitted that the company made a mistake. He also noted that, considering the company’s response to the issues that have been brought up about the chatbot, OpenAI deserves some legitimate criticism.

“We made a mistake: The system we implemented did not reflect the values we intended to be in there. And I think we were not fast enough to address that. And so I think that’s a legitimate criticism of us,” Brockman said. He also highlighted that OpenAI seeks to roll out an AI that is not biased in any way. Brockman acknowledged, however, that the startup is still some distance away from this goal.
“Our goal is not to have an AI that is biased in any particular direction. We want the default personality of OpenAI to be one that treats all sides equally. Exactly what that means is hard to operationalize, and I think we’re not quite there,” he said.
18   Ceffer   2023 Mar 10, 11:50am  

When I ask ChatGPT about Bigfoot, I get immense hirsute lesbian pornography and a message telling me Jesus says it's all good.

That is so wrong.
19   Shaman   2023 Mar 10, 2:45pm  

I think you guys are missing the point.
AI may not ever take over the world and make it into a dystopia.
It is, however, extremely disruptive to many many many industries.
ChatGPT showed how it could replace some lower-tier writers. That’ll be true for programmers as well. This AI is evolving fast. Soon the output from the process will be indistinguishable from that of a competent writer/coder. And then it will be far too good for any human to ever catch up.

That will be the point at which we stop needing so many workers to keep things running and our population entertained. Why act out a drama when AI can write a better one and then use CGI to animate it in cartoon or live action?

When studios can use AI to pump out a new blockbuster every day, what’s the value of watching films?

When publishers can use AI to pump out new books every day, what’s the value in reading?

When software corps can use AI to write better code than any human can write, and write it so fast that instant updates are possible, what’s the point of programmers?

At least 50% of all jobs will be eliminated by AI, with the “pajama class” going first.

What will be left are the blue collar jobs.
And a lot of hungry people.
20   FortwayeAsFuckJoeBiden   2023 Mar 10, 2:51pm  

It's neat, but I'm annoyed at ChatGPT fucking lecturing me about being polite and inclusive
21   richwicks   2023 Mar 10, 2:53pm  

Shaman says


When studios can use AI to pump out a new blockbuster every day, what’s the value of watching films?


There was NEVER any value in watching films.

Shaman says

When publishers can use AI to pump out new books every day, what’s the value in reading?


There's little value in fiction at all.

Shaman says

When software corps can use AI to write better code than any human can write, and write it so fast that instant updates are possible, what’s the point of programmers?


Ask the AI to write AES256.
22   PeopleUnited   2023 Mar 10, 5:59pm  

Shaman says

When studios can use AI to pump out a new blockbuster every day, what’s the value of watching films?

Bob Newhart, Jerry Seinfeld, Chris Rock, John Williams, Hans Zimmer, Billy Graham, Billy Sunday, Bing Crosby, Louis Armstrong, Eric Clapton, Jimi Hendrix, Frank Sinatra, Ben Franklin, Thomas Jefferson... the list could go on and on. And maybe these guys get too much credit for their work, but there is simply no way AI is going to replace the abilities and contributions that human beings like you, me, and them can make. The only way AI replaces humans is if the powers that be determine that they don’t need us.

AI cannot replace humans, but that doesn’t mean that the satanic globalists don’t want you to believe that you are worthless.
23   RWSGFY   2023 Mar 10, 6:09pm  

Shaman says


I think you guys are missing the point.
AI may not ever take over the world and make it into a dystopia.
It is, however, extremely disruptive to many many many industries.
ChatGPT showed how it could replace some lower-tier writers. That’ll be true for programmers as well. This AI is evolving fast. Soon the output from the process will be indistinguishable from that of a competent writer/coder. And then it will be far too good for any human to ever catch up.

That will be the point at which we stop needing so many workers to keep things running and our population entertained. Why act out a drama when AI can write a better one and then use CGI to animate it in cartoon or live action?

When studios can use AI to pump out a new blockbuster every day, what’s the value of watching films?

When publishers can use AI to pump out new books every day, what’s the value in reading?

When software corps can use AI to write better code than ...


If AI is so powerful as to write better code than Linus and better plays than Shakespeare, what makes you think some AI-guided robot can't replace a guy who does pretty mundane things like installing AC systems or maintaining heavy machinery? Not enough dexterity in the manipulators? I'm sure the almighty AI can design those much better than the now-redundant engineers... if it's as omnipotent as described above.

IF.
24   Tenpoundbass   2023 Mar 10, 6:37pm  

Shaman says

When software corps can use AI to write better code than any human can write, and write it so fast that instant updates are possible, what’s the point of programmers?

As a systems builder, I can tell you AI can't possibly write one-off complex business requirements. By the time it "learned" the skills required to fulfill the requirements and got to the 100th rollout, a capable programmer would have already wrapped the three-month project up and put it in the can.
25   pudil   2023 Mar 10, 6:56pm  

TPB, it doesn’t need to write perfect code. It just needs to be able to do all the boilerplate and uncomplicated code. If it can do take 90% of the effort off your plate, plus give helpful suggestions on the other 10%, then I can fire 90% of my coders and qa engineers.
26   Shaman   2023 Mar 10, 8:23pm  

pudil says

TPB, it doesn’t need to write perfect code. It just needs to be able to do all the boilerplate and uncomplicated code. If it can do take 90% of the effort off your plate, plus give helpful suggestions on the other 10%, then I can fire 90% of my coders and qa engineers.


Exactly. Maybe it won’t design systems but it will do all the grunt work of implementing the design. Really really fast and really really cheap.

Richwicks if you read some fiction you’d be better prepared to understand the unforeseeable before it happens to you.
27   richwicks   2023 Mar 10, 9:06pm  

Shaman says

Richwicks if you read some fiction you’d be better prepared to understand the unforeseeable before it happens to you.


I read plenty of fiction as a kid. Understanding how the world works is much more useful for understanding what is going to happen.

The US is self-imploding, maybe purposely. Either the Neocons are just incredibly fucking stupid, or they are purposely driving this nation into 3rd-world status. We're moving into either economic fascism or communism - it doesn't make very much difference. The top of society is already fascist, which is why 5 trillion dollars was created for the "covid bailout" but only 0.4 trillion (maximum) made it to individuals. Most of that money was used to purchase corporations, or at least purchase their loyalty.

We may even divide up between India/China/Russia and US/Canada/Australia/NZ/Europe. Africa and South America will just end up being contested and unaligned.
28   PeopleUnited   2023 Mar 11, 5:53am  

The world will soon be divided into 10 regions. And out of this new world order the antichrist will rise.
29   Tenpoundbass   2023 Mar 11, 8:11am  

pudil says


TPB, it doesn’t need to write perfect code. It just needs to be able to do all the boilerplate and uncomplicated code.

I've got SQL scripts that will make boilerplate code using your database table and field names and even set the correct value type (see the sketch after this comment).
There's Entity Framework, which will do it as well (way too much overhead), but I prefer to just script the objects I need from the tables I want.

But besides all of that, most business models can't fit in an AdventureWorks database and be managed by an out-of-the-box CRM or ERP.
If that were even remotely possible, 90% of the software developers over the last 30 years would not have had a job.
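
The schema-to-boilerplate trick described above can be sketched roughly like this, in Python rather than a T-SQL script. The table name, sample rows, and type map below are invented for illustration; a real version would run the INFORMATION_SCHEMA query against a live connection and feed the rows into the same generator.

INFORMATION_SCHEMA_QUERY = """
SELECT COLUMN_NAME, DATA_TYPE, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = ?
ORDER BY ORDINAL_POSITION
"""  # shown for reference; the sample rows below stand in for a live cursor

# What a cursor.fetchall() against that query might return for a hypothetical
# table called "Customer".
sample_columns = [
    ("CustomerId", "int", "NO"),
    ("Name", "nvarchar", "NO"),
    ("CreatedOn", "datetime", "NO"),
    ("Balance", "decimal", "YES"),
]

# Map SQL types to C# value types: the "set the correct value type" step.
SQL_TO_CSHARP = {
    "int": "int",
    "bigint": "long",
    "bit": "bool",
    "decimal": "decimal",
    "datetime": "DateTime",
    "nvarchar": "string",
    "varchar": "string",
}

def emit_class(table_name, columns):
    """Emit a plain C# class, one auto-property per column."""
    lines = [f"public class {table_name}", "{"]
    for name, sql_type, nullable in columns:
        cs_type = SQL_TO_CSHARP.get(sql_type, "object")
        if nullable == "YES" and cs_type != "string":
            cs_type += "?"                      # nullable value types
        lines.append(f"    public {cs_type} {name} {{ get; set; }}")
    lines.append("}")
    return "\n".join(lines)

print(emit_class("Customer", sample_columns))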
30   just_passing_through   2023 Mar 11, 12:11pm  

People are talking about coding AIs and chatting AIs while behind the scenes they are taking over - just maybe not the way you think they are.

Data Scientists/Engineers are growing quickly (with high pay), probably quicker than any other segment in my field.

Where I work we make widgets. For several years we've used heuristics (rules of thumb) to tell customers yes or no on whether we can make the custom widget they want to order.

Well, now we have several years of training data and we trained a model. This allowed us to remove most of the heuristics and to provide better responses:

1. Sure, no problem, this is easy - in which case we use a newer process where we can take shortcuts, saving both us and the customer money as well as time.
2. This is going to be hard - we use the older, more expensive, slower process
3. Nope, you ordered science fiction (for now)

We could not do that without the AI model.
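
A hedged sketch of the workflow described above, with everything invented: the feature columns, class names, and random "historical" data here are stand-ins, since the real training set and model are obviously proprietary. The shape of it is simply: fit a classifier on past orders, then let it answer easy / hard / not feasible for a new request.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Pretend history: a few numeric specs per past order (size, tolerance, material
# index, complexity score) and the outcome the old rules of thumb eventually produced.
X_history = rng.normal(size=(5000, 4))
y_history = rng.choice(
    ["easy_new_process", "hard_old_process", "not_feasible"],
    size=5000,
    p=[0.5, 0.35, 0.15],
)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_history, y_history)

# A new custom-widget request comes in; the model replaces most of the yes/no heuristics.
new_order = rng.normal(size=(1, 4))
print(model.predict(new_order)[0])
print(dict(zip(model.classes_, model.predict_proba(new_order)[0].round(2))))

On real data, the predicted class (and its probability) is what would route an order to the cheaper process, the older expensive process, or a polite "not yet".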
31   Tenpoundbass   2023 Mar 11, 1:13pm  

What you described is no more earth shattering than the introduction of drag-and-drop RAD in software development, whereas before it was all done by hand-coding text files.
32   Tenpoundbass   2023 Mar 11, 3:01pm  

Moreover, when you replace one complex system or process with an even more sophisticated process, you always have to have qualified people to operate those systems.
Take the original intent of BizTalk, a pre-Salesforce offering from Microsoft. It was touted at all of the Microsoft tech roadshows as something that the business principals in the enterprise who are power Excel users would be able to configure without the need for developers, database report writers, or admins. It was so convoluted that only 1% of the very BEST Microsoft developers ever even worked on, much less saw, a BizTalk implementation.

And as for AI being faster at mocking up a working prototype software model, I would definitely race it with my tools, methods, and script library that I have created and acquired over the years. I can produce a working prototype in 48 hours or less. It's the going back in afterwards and bridging in the required business logic that AI isn't going to be able to do.
It has no idea that every Thursday Marge has to run a report and give it to John, who then looks for the Fred exceptions, and sends an agricultural tax report to the Tennessee Agriculture department.
33   just_passing_through   2023 Mar 11, 7:52pm  

Tenpoundbass says


Moreover, when you replace one complex system or process with an even more sophisticated process, you always have to have qualified people to operate those systems.


This is true but still saves us and our customers time and money. It also helps us make things that we otherwise wouldn't have tried. It definitely optimizes solutions.

Tenpoundbass says


what you described is no more earth shattering


It's not, it just helps us solve biological problems, that's it.

Tenpoundbass says


I would definitely race it with my tools, methods, and script library that I have created and acquired over the years.


And it would beat the living shit out of you, mere human. For example, assume those widgets are custom proteins. Very complicated; scientists have worked on the problem for decades (well, protein folding in general). Standard software and human brains never sufficed. In the end it took AI (see AlphaFold). That's what you're up against.

Honestly the last several years we were just guessing and throwing shit on the wall hoping it would stick and it did often enough to get the biz through the initial growth phase.

Once we had collected that data though - well now our AI model does in fact predict what would work and what won't accurately and nearly instantly (thanks rust!). In fact there are a lot of things we are now able to make we couldn't before.

I think it's really going to help us scale the business, and it can't happen soon enough as far as I'm concerned, since SVB stole some of our seed money. Rat bastards!
34   just_passing_through   2023 Mar 11, 8:24pm  

I guess my point is, the advancements AI is making are in the back end too, where the typical person wouldn't think to look. They just think of chatbots and burger flippers. So you may not notice how much is actually happening.
35   just_passing_through   2023 Mar 11, 8:24pm  

It's definitely not a fad, and it won't be going away barring some disaster that sets us back to the stone age.
36   richwicks   2023 Mar 11, 9:46pm  

AI creates the illusion of intelligence.

When an AI can come up with a new solution that hasn't been programmed into it, I'll be worried.

And yes, I've tested the AI on this. It cannot even produce known solutions. I asked it to produce an algorithm for SHA256 sums; it referred me to a library, and the library it referred me to was OpenSSL, a KNOWN compromised library (and a widely used one).

It's impressive in that it can mimic intelligence, but I've not seen any ability to be creative. It's very limited. It doesn't think and it cannot create. It frequently PRETENDS to know, when it doesn't. It's just a very clever bullshitting machine.
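
For contrast with the library pointer the chatbot gave, this is roughly what a from-scratch answer to "produce an algorithm for SHA256 sums" looks like: a minimal pure-Python transcription of the published SHA-256 algorithm (FIPS 180-4). Illustrative only; it is not hardened, not constant-time, and far slower than any real implementation.

import struct

# First 32 bits of the fractional parts of the cube roots of the first 64 primes.
K = [
    0x428a2f98, 0x71374491, 0xb5c0fbcf, 0xe9b5dba5, 0x3956c25b, 0x59f111f1, 0x923f82a4, 0xab1c5ed5,
    0xd807aa98, 0x12835b01, 0x243185be, 0x550c7dc3, 0x72be5d74, 0x80deb1fe, 0x9bdc06a7, 0xc19bf174,
    0xe49b69c1, 0xefbe4786, 0x0fc19dc6, 0x240ca1cc, 0x2de92c6f, 0x4a7484aa, 0x5cb0a9dc, 0x76f988da,
    0x983e5152, 0xa831c66d, 0xb00327c8, 0xbf597fc7, 0xc6e00bf3, 0xd5a79147, 0x06ca6351, 0x14292967,
    0x27b70a85, 0x2e1b2138, 0x4d2c6dfc, 0x53380d13, 0x650a7354, 0x766a0abb, 0x81c2c92e, 0x92722c85,
    0xa2bfe8a1, 0xa81a664b, 0xc24b8b70, 0xc76c51a3, 0xd192e819, 0xd6990624, 0xf40e3585, 0x106aa070,
    0x19a4c116, 0x1e376c08, 0x2748774c, 0x34b0bcb5, 0x391c0cb3, 0x4ed8aa4a, 0x5b9cca4f, 0x682e6ff3,
    0x748f82ee, 0x78a5636f, 0x84c87814, 0x8cc70208, 0x90befffa, 0xa4506ceb, 0xbef9a3f7, 0xc67178f2,
]

def _rotr(x, n):
    return ((x >> n) | (x << (32 - n))) & 0xffffffff

def sha256(message: bytes) -> str:
    # Initial hash values: fractional parts of the square roots of the first 8 primes.
    h = [0x6a09e667, 0xbb67ae85, 0x3c6ef372, 0xa54ff53a,
         0x510e527f, 0x9b05688c, 0x1f83d9ab, 0x5be0cd19]

    # Padding: one 0x80 byte, zeros, then the original bit length as 64-bit big-endian.
    length = len(message) * 8
    message += b"\x80"
    message += b"\x00" * ((56 - len(message) % 64) % 64)
    message += struct.pack(">Q", length)

    # Process each 512-bit block.
    for i in range(0, len(message), 64):
        w = list(struct.unpack(">16L", message[i:i + 64])) + [0] * 48
        for t in range(16, 64):
            s0 = _rotr(w[t - 15], 7) ^ _rotr(w[t - 15], 18) ^ (w[t - 15] >> 3)
            s1 = _rotr(w[t - 2], 17) ^ _rotr(w[t - 2], 19) ^ (w[t - 2] >> 10)
            w[t] = (w[t - 16] + s0 + w[t - 7] + s1) & 0xffffffff

        a, b, c, d, e, f, g, hh = h
        for t in range(64):
            S1 = _rotr(e, 6) ^ _rotr(e, 11) ^ _rotr(e, 25)
            ch = (e & f) ^ (~e & g)
            temp1 = (hh + S1 + ch + K[t] + w[t]) & 0xffffffff
            S0 = _rotr(a, 2) ^ _rotr(a, 13) ^ _rotr(a, 22)
            maj = (a & b) ^ (a & c) ^ (b & c)
            temp2 = (S0 + maj) & 0xffffffff
            hh, g, f, e = g, f, e, (d + temp1) & 0xffffffff
            d, c, b, a = c, b, a, (temp1 + temp2) & 0xffffffff

        h = [(x + y) & 0xffffffff for x, y in zip(h, [a, b, c, d, e, f, g, hh])]

    return "".join(f"{x:08x}" for x in h)

# sha256(b"abc") should print the standard test vector:
# ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
print(sha256(b"abc"))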
37   PeopleUnited   2023 Mar 11, 10:10pm  

just_passing_through says

Once we had collected that data though - well now our AI model does in fact predict what would work and what won't accurately and nearly instantly (thanks rust!). In fact there are a lot of things we are now able to make we couldn't before.

Building anti cancer targeted therapies?

They’ve been doing things like that for decades already. CML used to be a death sentence but now with imatinib and other TKI’s people are living out a full lifespan, not dying of cancer. And these drugs were rationally designed. AI technology just allows you to test things like that in real time but it is still a reasoning scientist looking for answers. It’s not like you can ask the computer how to kill a cancer and it spits out a pill.
38   Tenpoundbass   2023 Mar 12, 9:54am  

Just passing through, come on now, pinky swear you'll come back and man up and admit that we called it, when this fizzles and peters out.
Way too many people have left me hanging over the years, after they gave me very passionate arguments against my logic and reason regarding Hype Tech.
39   Tenpoundbass   2023 Mar 12, 9:58am  

I also suspect you're throwing new technologies into the lot of generalized AI discussions. When the main hype of AI wanes and the dust settles, there will be lots of technologies that either came out of AI or were used to bolster AI's capabilities by being called AI. But the crux of what is being tossed around as AI will be a thing of the past, and those other technologies will be more appropriately named. Proximity sensors, for example: when they were created, they weren't called AI, but now they are. Shit like that is propping up AI.
40   Tenpoundbass   2023 Mar 12, 1:00pm  

Here is AI Illustrated.




You see, at the end of the day you're still killing spiders with a shoe; you're just using a very elaborate way of going about it.
Take a complex problem like a gigantic spider: you're using AI to place a bigger shoe over the spider, where I would expect exoskeletal giant robots with laser-beam eyes to be used. Perhaps I'm just more demanding than low-information tech fans.
41   just_passing_through   2023 Mar 12, 1:00pm  

PeopleUnited says

Building anti cancer targeted therapies?

I'm not going to say other than it's new and disruptive.
42   just_passing_through   2023 Mar 12, 1:00pm  

Tenpoundbass says

Just passing through, come on now, pinky swear you'll come back and man up and admit that we called it, when this fizzles and peters out.


Sure, when that happens just toss this at me and I'll grant that; but it won't happen.
43   just_passing_through   2023 Mar 12, 1:03pm  

For kicks, someone I work with tried to get another AI platform to write bioinformatics code for us. The performance was so bad we figure our jobs are safe for a while. The joke is that bioinformaticians made our file formats so poorly that even advanced AI can't figure them out.
44   just_passing_through   2023 Mar 12, 1:14pm  

Tenpoundbass says

I also suspect you're throwing new technologies into the lot of generalized AI discussions.


Nope. As I mentioned before, we have had this disruptive tech for several years now without AI. We're getting a helluva performance boost using it now, though, that we otherwise wouldn't have.

The AI I'm speaking about is the standard: build a model with training data, then give it new inputs to collect output predictions. We just get numbers back, not pictures of shoes or a bratty chatbot. There are a lot of these being implemented in companies now that nobody discusses because just getting numbers back isn't sexy.
45   Tenpoundbass   2023 Mar 12, 2:23pm  

just_passing_through says

There are a lot of these being implemented in companies now that nobody discusses because just getting numbers back isn't sexy.


The funny thing about numbers: when the business principal wants a report, they already know the numbers they are looking for, and they will not accept the data until you have given them a list with the numbers they have in mind. There have been times the person was wrong and the numbers I produced, based on the required filter, were the right ones. People like that will blindly accept a list of pure bogus values as long as the count aligns with what they are expecting. I could appease those people by doing a top 21 query.

So how do you know those numbers aren't what you want, vs what you need?
46   just_passing_through   2023 Mar 12, 2:25pm  

Tenpoundbass says

So how do you know those numbers aren't what you want, vs what you need?


I'm trying to stay anon so that's about all I'm going to say on the subject.
47   Tenpoundbass   2023 Mar 12, 3:13pm  

Self-programming programs have been a goal for as long as I have been in software development. The only things different between then and now are better algorithms and processors.
There's a niche in software where the program or app is every bit as much an art as it is a science. By that I mean it takes a special talent to conceptualize and create those programs.
While you certainly can boilerplate the mundane code through hundreds of different processes once you have established the programming pattern for those operations, I can't see AI creating new programming patterns and concepts with valid use cases and working principles. It will only draw on what is available to it in that regard, as the program flow for the AI engine would already determine its design patterns and principles. Programmers will always be needed to update the code base to accommodate new standards and practices.

