This is kind of long, and I'm simplifying some things, but bear with me. I'm not convinced Mark Cuban's prediction will come to pass, at least not in the way he's thinking of it. And that's because we've already been automating away tedious tasks, and because software development isn't "just math" and can be very ambiguous. And computers are awful at dealing with ambiguity on their own.
On the first point, if you look at the history of programming languages, we've always automated tasks where we could. With early computers, you'd have to literally flip switches to instruct the machine to do exactly what you wanted. When modern electronic computers became a thing in the 1950s, if you wanted to write a program for one, you'd write it in an assembly language. Assembly languages are really low-level instructions for the CPU itself, just one step above the machine code, which is the series of binary values that correspond to commands the CPU understands. And when I say low-level instructions, I mean things like "add the two values at these memory locations together", "move this value from x place to y place", "store this new value in memory". It's tedious. But you could get things done faster than literally toggling switches.
So not long after assembly languages became a thing, people started building programming languages that get compiled down into assembly code or machine code, with Fortran being the first "high level" programming language of this kind. Instead of worrying about writing all of that tedious assembly code to tell the computer exactly how to do something, we write code that's a closer representation of what we actually want to do and then have a compiler build the machine code the computer actually understands (I'm simplifying this process for the sake of argument). And we're a lot more productive because of it. Programmers didn't lose their jobs as compiled languages became more common. You still need someone who can "tell" the compiler what to build. And figuring out "what to build" is actually where the bulk of a software engineer's work is.
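To make that contrast concrete, here's a minimal sketch in C (not Fortran, just a modern compiled language for illustration). The one high-level line that does the addition is the kind of thing the compiler expands into those load/add/store steps for you; the exact instructions it produces vary by CPU and compiler, so the comments are only a rough description, not a real assembly listing.

```c
#include <stdio.h>

int main(void) {
    int a = 2, b = 3;

    /* One high-level statement. The compiler turns this into something
       like: load a from memory, load b from memory, add them in the CPU,
       store the result back into c. You never write those steps yourself. */
    int c = a + b;

    printf("%d\n", c);
    return 0;
}
```

The point isn't the arithmetic; it's that the tedious "how" got automated away, and the programmer's job shifted up a level to describing the "what".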