
AI - A force for Good or Evil??


On 03/11/2023 at 18:14, danny. said:

So say a company currently has a system designer, and then 30 coders, project managers, scrum masters and QC testers. Let AI take a prompt from the system designer and that's 30 of 31 jobs gone. AI will probably be able to design the system too, so you can take the entire software house out of the picture and the client can just tell AI what they want and it comes back in minutes instead of months.
 

I agree creative tasks are harder to replace, but only up to a point, as most things "creatives" do aren't really groundbreaking but rather copying and merging existing bits of creative work, which AI is great at; take Midjourney, for example. And even if no creatives are replaced, that's only a very small fraction of the many industries facing replacement.

@dunge  

 

I understand your concern, but I honestly don't think it will be as bad as you think. Admittedly the need for some roles will lessen, but they're not going to disappear. Even if you've got something that creates code, you'll still need people who can read that code and fix it, amend it slightly, or tune it. OK, not as many, but in IT there are always other job roles opening up as some disappear. I've found that the more automation you build in, the more monitoring, tuning and fixing you tend to need. Technical support rather than pure coders, but it's a constant process of learning new skills and being flexible.

 

Unless computers have suddenly changed whilst I wasn't looking, they fundamentally still work the same way as they have done for ages. Whether it's Unix, Windows, zSeries IBM mainframes or mobile phone operating systems, they basically work by executing machine code and performing operations on data. You can create that machine code in a variety of ways, and computing has constantly moved towards more abstraction in its generation ... whether you actually write the code yourself or have pseudo-code assembled/compiled/translated/interpreted (or whatever you want to call it), you still need to specify something in, to get something out.
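To make that concrete, here's a rough sketch (Python, purely for illustration): the function below is the "something in", and the standard dis module lets you peek at the lower-level instructions it gets translated into before anything actually runs.

```python
import dis

def total(prices):
    """High-level 'something in': add up a list of prices."""
    t = 0
    for p in prices:
        t += p
    return t

print(total([1.50, 2.25, 3.00]))  # the 'something out'

# Peek at the lower-level instructions the interpreter actually
# executes - loads, adds and stores, generated for us from the
# high-level source above.
dis.dis(total)
```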
 
Complex and sophisticated programs can certainly make a computer look "intelligent" or "clever". But it's not. Computers are quite stupid. They basically just do what you tell them, they do it zillions of times over and over again without getting bored, and they do it very quickly. But that's it. They are not organic. They don't have brains. They can't think for themselves. The most "clever" computer systems have been designed and programmed by "clever" people. As you might be able to tell, I haven't quite bought into all the "AI" hype.
 
So much has changed in terms of increased abstraction (writing pseudo-code instead of 3GL languages), processing power, memory, disk capacity, parallelism, buffering data in cache as opposed to physical I/O reads, non-procedural relational databases rather than procedural hierarchical and network databases, etc. But the underlying principles are basically the same: arithmetic, logic and processing instructions (on data).
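If it helps, here's a minimal sketch of that procedural-versus-relational point (Python with the built-in sqlite3 module; the table and data are invented for the example). The first version spells out HOW to walk the rows; the second just declares WHAT is wanted and leaves the "how" to the database engine.

```python
import sqlite3

# Throwaway in-memory database; table and rows invented for the example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("Alice", 120.0), ("Bob", 80.0), ("Alice", 40.0)])

# Procedural style: tell the computer HOW to get the answer, row by row.
totals = {}
for customer, amount in conn.execute("SELECT customer, amount FROM orders"):
    totals[customer] = totals.get(customer, 0.0) + amount
print(totals)

# Non-procedural (relational) style: state WHAT you want; the engine
# decides how to fetch, group and sum it.
print(conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer"
).fetchall())
```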
 
Even if you've got something that can generate all the relevant machine code in a few seconds, you still need to specify the parameters and business rules for that code. You'll need business analysts to create the specification, system designers to plan the physical environments, testers to verify the generated code, project planners to liaise with all relevant parties, implementation and release planning personnel, performance specialists to monitor and tune the less-than-optimal generated code, server/network support teams to maintain the environments, engineers to plug in and install the hardware, etc.
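As a toy example of what "specifying the business rules" can look like (all names and the discount rule here are invented, not from any real system), even a machine-generated function only means something once a human has written down the rule it must satisfy:

```python
def apply_discount(order_total):
    """Pretend this body was generated by an AI tool."""
    return order_total * 0.9 if order_total >= 100 else order_total

def test_discount_rule():
    # The human-specified business rule: orders of 100 or more
    # get 10% off; smaller orders are charged in full.
    assert apply_discount(100) == 90
    assert apply_discount(250) == 225
    assert apply_discount(99) == 99

test_discount_rule()
print("Generated code satisfies the specified rule.")
```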
 
Every advance in computing was partially sold on reducing the headcount needed to support it. This has probably been true with respect to the pure programming/coding roles, but every other area within IT has needed lots of people to support it. Computers don't jump up into racks and run themselves.
 
Take a simple example: get so-called "AI" to create a picture of a cat.
 
What type of cat? Fur colour? Eye colour? Tail size? Ear shape and size?
Is it wearing a collar? If so, what colour? Material? Size? Does it have a bell on it? If so, what shape bell? Size? Colour?
Is the picture taken indoors or outdoors? If indoors, on a carpet? Wooden floor? Tiles? Which room is it in? What's the background?
If outdoors, in the garden? Street? On top of your car? Up a tree? Chasing a mouse? What's the background weather doing? Etc.
I probably haven't covered 10% of the parameters you would need to specify ... and you do need to specify them somewhere, as computers can't (yet) read our minds. And that's just a simple/trivial example I could think of. A fully functioning business system is a million times more complicated than this, and it will still need planning, spec'ing, testing, implementing, monitoring, tuning, changing, fixing, and other things I've forgotten.
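Written out as a data structure (Python again, every field name invented just to make the point), even this trivial cat-picture spec piles up decisions quickly; anything you don't specify, the system has to default for you:

```python
# Every field name here is invented purely for illustration.
cat_picture_spec = {
    "cat": {"breed": "tabby", "fur_colour": "ginger", "eye_colour": "green",
            "tail": "long", "ears": "pointed"},
    "collar": {"present": True, "colour": "red", "material": "leather",
               "bell": {"present": True, "shape": "round"}},
    "setting": {"location": "outdoors", "place": "garden",
                "background": "overcast sky", "activity": "chasing a mouse"},
}

def count_decisions(node):
    """Count the leaf-level choices buried in a nested spec."""
    if isinstance(node, dict):
        return sum(count_decisions(v) for v in node.values())
    return 1

print(count_decisions(cat_picture_spec), "decisions, and that's nowhere near all of them")
```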

 


On 04/11/2023 at 20:16, Dunge said:

Regarding Crypto, I don't see that it's like gold. Gold is a commodity; it has an inherent value and use to people. Its inert nature, relative rarity and desirability make it an object of value across cultures. If someone tomorrow decided the value of gold was zero, it would come back up again precisely because it's something tangible to represent value. Crypto represents value but it isn't tangible. If it was decided tomorrow that a given Crypto currency had a value of zero, then there's no reason it should recover, because it has no inherent value beyond generally accepted confidence.

 

What that link is describing, for me, is more of a hybrid - looking to utilise some of the technological developments that have come from Crypto, without necessarily using Crypto currencies themselves. But the fact remains that if the banking system builds a reliance on Crypto without having control over it, then we'll just end up with another banking crisis like that of the late 2000s, where the banks, regulators and populace all allowed a system that they didn't understand to get out of control. Combine that with AI and all I can see is an accident waiting to happen unless they properly limit it.

I dunno, I think it's here to stay and will continue to grow. https://watcher.guru/news/crypto-uks-bank-of-england-releases-stablecoin-regulation-plan


4 hours ago, worth_the_wait said:

@dunge

I understand your concern, but I honestly don't think it will be as bad as you think. …

 

I agree completely.


3 hours ago, whoareyaaa said:

I dunno, I think it's here to stay and will continue to grow. https://watcher.guru/news/crypto-uks-bank-of-england-releases-stablecoin-regulation-plan

This feels like a con of a con!

 

The key phrase for me in that article is "backed by the British pound", which is the whole thing Crypto was set up to avoid in the first place. It's like trying to get people into Crypto while actually linking it to a nation-state currency, and trying to take control of an anarchic market.

 

Personally I hope they're able to take that control rather than getting nation-state currencies dragged into the abyss. Although another possibility is that they're found to be incompatible. Still, I'm sure Rishi Sunak wouldn't waste money like that…


7 hours ago, worth_the_wait said:

@dunge

I understand your concern, but I honestly don't think it will be as bad as you think. …

 

You're arguing against your own argument. So you need to give a detailed prompt to Midjourney, which might take 45 seconds and can be done by anyone with a basic grasp of English and zero artistic talent. This will save three weeks of work that would otherwise need to be carried out by a commissioned artist, and somehow that won't put the artist out of a job.
 

The people specifying systems and writing briefs and scopes aren’t the same roles as the people currently implementing the work, so again I’m not sure how you think these industries won’t be decimated.


1 hour ago, danny. said:

You're arguing against your own argument. … so again I'm not sure how you think these industries won't be decimated.

They 100% will be decimated IMO, but I could be wrong.

Tech support roles and other IT roles will pay less, as sooooo many candidates will be fighting for those roles.

It will be interesting to see what roles are carved out of automation where it can't do certain things effectively (until AI gets smarter and figures that out lol).

Think I should get into making porn, nothing beats real meat.


7 hours ago, Jattdogg said:

They 100% will be decimated IMO, but I could be wrong. … It will be interesting to see what roles are carved out of automation where it can't do certain things effectively.

And this is what people aren't getting. Yes, there are some things AI is bad at, or just can't do right now. And because we've previously seen it take decades for traditional computer software to advance, people think AI advances are decades away. But AI models get massively better with each version of a system/model, and those versions come out several times a year. Look at the leaps in, say, ChatGPT or Midjourney from just a year ago.


16 hours ago, danny. said:

You're arguing against your own argument. … so again I'm not sure how you think these industries won't be decimated.

Not really. I was just pointing out that I don't think so-called "AI" will decimate the IT industry. Some roles won't need as many people, but there will still be plenty of others.


1 hour ago, worth_the_wait said:

Not really. I was just pointing out that I don't think so-called "AI" will decimate the IT industry. Some roles won't need as many people, but there will still be plenty of others.

Well, I hope you're right, but experts in the field don't share your optimism. I wouldn't describe myself as an expert, but I use and follow AI quite closely, and have done for a while, and I also can't see where you're getting that viewpoint from.


5 hours ago, danny. said:

Well, I hope you're right, but experts in the field don't share your optimism. I wouldn't describe myself as an expert, but I use and follow AI quite closely, and have done for a while, and I also can't see where you're getting that viewpoint from.

To be honest, I don't understand this. You mention experts, yet the first page you find on Google when searching for opinions on AI and its potential threat to jobs is filled with experts and commentators basically saying what worth_the_wait is saying.

 

And that’s before we get into the massive problems AI is going to hit with plagiarism and copyright law.


On 07/11/2023 at 01:46, Jattdogg said:

They 100% will be decimated IMO, but I could be wrong. … Think I should get into making porn, nothing beats real meat.

AI porn will anticipate your every desire.

AI porn will be hooked up to AI gadgetry.

AI porn will be better than 2D porn.

