Agents will allow human programmers to get what they've been begging for for decades now: proper requirements and flexible, logical tooling.
> Agents will allow human programmers to get what they've been begging for for decades now: proper requirements and flexible, logical tooling.
...and once this goal is finally reached the programmer will breathe a sigh of relief and then promptly be fired since now the machine can do the job as well as they could.
Let's see whether even mid-size/big companies with tons of resources, with AI and the right tooling, will continue to write webview apps or, even worse, use some kind of multi-target wrapper.
>Google collects usage data for the Android CLI, such as commands, sub-commands, and flags used. This data does not include custom parameters or identifiable information. This information helps improve the tool and is collected in accordance with Google's Privacy Policy.
>https://policies.google.com/privacy
>Disable Android CLI metrics collection by using the --no-metrics flag.
No thanks, is there no env variable for this? Doesn't Google have enough data already?
You could write a tool that wraps android-cli and automatically passes the flag based on an env variable.
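A minimal sketch of such a wrapper, as a shell function for your profile. The `--no-metrics` flag comes from the quoted docs; the `ANDROID_CLI_METRICS` variable name is an invention for this sketch, not a real android-cli setting.

```shell
# Hypothetical wrapper: opt out of metrics by default, unless the user
# sets ANDROID_CLI_METRICS (a made-up variable name for this sketch).
android() {
  if [ -n "${ANDROID_CLI_METRICS:-}" ]; then
    # User explicitly opted in: pass arguments through untouched.
    command android-cli "$@"
  else
    # Default: prepend --no-metrics so collection stays off.
    command android-cli --no-metrics "$@"
  fi
}
```

Dropped into `~/.profile`, `android <subcommand>` would then run `android-cli --no-metrics <subcommand>` unless `ANDROID_CLI_METRICS` is set.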
How would Google have enough data about a brand new product without collecting that data?
Now please let us install the apps just as easily
Downloading an APK and opening it is already about as easy as it gets. The only thing easier would be for someone else to do it for you.
But can I publish an app without having to share my ID? If not, I don't want it.
Absolutely not. That would be crazy.
Man, we are really entering into the next level of hell with this LLM business. I predict a boom in the hacking industry as people get thoroughly owned by all the resulting giant security flaws in the public facing dumpster fires people are gleefully shoving out the door.
Just wait until there are entire classes of vulnerabilities related to LLM usage, such as malignant patterns, rooted in flaws in the training data or algorithms, that generate predictably bad code at a statistically significant rate which can be known and exploited. In other words, the LLM subtly adds in the security flaws by accident. Or by "accident."
Some people in the future are going to make big bucks cleaning up all this garbage. Just think of the situation now when trying to migrate some company off an ancient tech stack that is core to its business, where all the software and hardware is obsolete and undocumented, except now it'll also be autogenerated by a 2026 LLM. LOL! It'll be highly paid for a reason: the job is a waking nightmare. Real soul-destroying stuff.
What does this have to do with the Android CLI?
Do I need to spell it out for you?
Since rafram is not the only one confused, yes, you really do.
It isn't that hard to understand:
> Just wait until there are entire classes of vulnerabilities related to LLM usage
The sort of issue that will be discovered and widely exploited is a new class of vulnerabilities in which an LLM is involved and which makes it possible to cause catastrophic damage to a company.
This won't be surprising since we have companies building casual remote code execution tools for "agents" waiting to be hijacked.
That probably depends on how good 2026-era LLMs already are. But I hope you’re right, and that pre-AI devs will still make a real difference.
Realistically it will be a giant hodgepodge of code that was steadily glommed onto by various versions of 2026, 2027, 2028, ... 2032 etc. LLMs over a decade or more of increasingly convoluted and inhomogeneous "progress" by a variety of programmers of various levels of "talent" and understanding... well, you can picture the rest. So, just the next level of hell.