Programming

Salesforce Executive Shares 'Four Ways Coders Can Fight the Climate Crisis' (forbes.com) 79


Salesforce's chief impact officer, writing in Forbes: Code and computer programming — the backbone of modern business — has a long way to go before it can be called "green..." According to a recent report from the science journal Patterns, the information and communication technology sector accounts for up to 3.9% of global emissions... So far, the focus has been on reducing energy consumption in data centers and moving electrical grids away from fossil fuels. Now, coders and designers are ready for a similar push in software, crypto proof of work and AI compute power...

Our research revealed that 75% of UX designers, software developers and IT operations managers want software to do less damage to the environment. Yet nearly one in two don't know how to take action: half of these technologists admit to not knowing how to mitigate environmental harm in their work, and 34% acknowledge that they "rarely or never" consider carbon emissions while typing a new line of code... Earlier this year, Salesforce launched a sustainability guide for technology that provides practical recommendations for aligning climate goals with software development.

In the article the Salesforce executive makes four recommendations, urging coders to design sites in ways that reduce the energy needed to display them. ("Even small changes to image size, color and type options can scale to large impacts.") They also recommend writing application code that uses less energy, which "can lead to significant emissions reductions, particularly when deployed at scale. Leaders can seek out apps that are coded to run natively in browsers which can lead to improvement in performance and a reduction in energy use."
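The article's scale argument (small front-end changes multiplied across millions of page views) can be made concrete with a rough back-of-the-envelope estimate. The sketch below is illustrative only: the energy-per-gigabyte and grid-carbon-intensity coefficients are assumed values chosen for the example, not figures from the article.

```python
# Illustrative estimate only: the two coefficients below are assumed values,
# not figures from the Forbes article or the Patterns report.
KWH_PER_GB_TRANSFERRED = 0.06   # assumed network + end-device energy per GB served
KG_CO2_PER_KWH = 0.4            # assumed average grid carbon intensity

def monthly_savings(old_kb: float, new_kb: float, monthly_views: int) -> tuple[float, float]:
    """Return (kWh, kg CO2) saved per month by shrinking one asset."""
    gb_saved = (old_kb - new_kb) * monthly_views / 1_000_000   # KB -> GB
    kwh = gb_saved * KWH_PER_GB_TRANSFERRED
    return kwh, kwh * KG_CO2_PER_KWH

# Example: a 1.2 MB hero image recompressed to 300 KB on a page with 5M views a month.
kwh, co2 = monthly_savings(old_kb=1200, new_kb=300, monthly_views=5_000_000)
print(f"~{kwh:.0f} kWh and ~{co2:.0f} kg CO2 saved per month")   # roughly 270 kWh, 108 kg
```

Even with deliberately rough coefficients, a single oversized asset on a busy page adds up to a measurable footprint, which is the scaling effect the recommendations are aimed at.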

Their article includes links to the energy-saving hackathon GreenHack and the non-profit Green Software Foundation. (Their site recently described how the IT company AVEVA used a Raspberry Pi in back of a hardware cluster as part of a system to measure software's energy consumption.)

But their first recommendation for fighting the climate crisis is "Adopt new technology like AI" to "make the software development cycle more energy efficient." ("At Salesforce, we're starting to see tremendous potential in using generative AI to optimize code and are excited to release this to customers in the future.")
Government

US Senate Panel Passes AM Radio, Ticket Fee Pricing Bills (reuters.com) 264

An anonymous reader quotes a report from Reuters: The U.S. Senate Commerce Committee approved legislation on Thursday to bar automakers from eliminating AM broadcast radio in new vehicles and require companies like Ticketmaster to put total ticket prices including fees in marketing materials. The AM radio bill and the ticket-pricing bill both had strong bipartisan support and both have companion measures in the House of Representatives. The AM radio bill would direct the Transportation Department to issue regulations mandating AM radio in new vehicles without additional charge. Senators said this year that at least seven automakers have removed AM broadcast radio from their electric vehicles, including Tesla, BMW, and Volkswagen. Ford reversed course in May under pressure from Congress. Lawmakers say losing AM radio undermines a federal system for delivering key public safety information to the public. The National Association of Broadcasters said the bill "will ensure that the tens of millions of AM radio listeners across the country retain access to local news, diverse community programming and emergency information." The Alliance for Automotive Innovation, a trade group representing major automakers, opposed the measure: "This is simply a bill to prop up and give preference to a particular technology that's now competing with other communications options and adapting to changing listenership."

The U.S. Senate Commerce Committee also approved two bills aimed at tightening privacy protections for children online.
Movies

Code.org Embraces Barbie 9 Years After Helping Take Her Down (tynker.com) 75

Long-time Slashdot reader theodp writes: The number one movie in North America is Warner Bros. Discovery's Barbie, which Deadline reports has teamed up with Oppenheimer to fuel a mind-blowing $300M+ box office weekend. ["Oppenheimer Shatters Expectations with $80 Million Debut," read the headline at Variety.]

Now it seems everybody is trying to tap into Barbie buzz, including Microsoft's Xbox [which added Barbie and Ken's cars to Forza Horizon 5] and even Microsoft-backed education nonprofit Code.org. ("Are your students excited about Barbie The Movie? Have them try an HourOfCode [programming game] with Barbie herself!").

The idea is to inspire young students to become coders. But as Code.org shares Instagram images of a software developer Barbie, Slashdot reader theodp remembers when, nine years ago, Code.org's CEO "took to Twitter to blast Barbie and urge for her replacement." They'd joined a viral 2014 Computer Engineer Barbie protest that arose in response to Barbie F***s It Up Again, a scathing and widely reported-on blog post that prompted Mattel to immediately pull the book Barbie: I Can Be a Computer Engineer from Amazon. The backlash may have helped Barbie lose her crown as the most popular girls' toy to Disney's Frozen princesses Elsa and Anna in the ensuing 2014 holiday season, and the Mattel exec who had to apologize for Computer Engineer Barbie was called to the White House for a sit-down a few months later. (Barbie got a brainy makeover soon thereafter)...

The following year, Disney-owned Lucasfilm and Code.org teamed up on Star Wars: Building a Galaxy with Code, a signature tutorial for the 2015 Hour of Code. Returning to a Disney princess theme in 2016, Disney and Code.org revealed a new Hour of Code tutorial featuring characters from the animated film Moana just a day ahead of its theatrical release. It was later noted that Moana's screenwriters included Pamela Ribon, who penned the 2014 Barbie-blasting blog post that ended Barbie's short reign as the Hour of Code role model of choice for girls.

Interestingly, Ribon seems to bear no Barbie grudges either, tweeting on the day of the Barbie movie release, "I was like holy s*** can't wait to see it."

To be fair, the movie's trailer promises "If you hate Barbie, this movie is for you," in a deconstruction where Barbie is played by D.C. movies' "Harley Quinn" actress Margot Robbie (Suicide Squad, Birds of Prey), whose other roles include Tonya Harding and the home-wrecking second wife in The Wolf of Wall Street.
AI

Ask Slashdot: What Happens After Every Programmer is Using AI? (infoworld.com) 127

There have been several articles on how programmers can adapt to writing code with AI. But presumably AI companies will then gather more data from how real-world programmers use their tools.

So long-time Slashdot reader ThePub2000 has a question. "Where's the generative leaps if the humans using it as an assistant don't make leaps forward in a public space?" Let's posit a couple of things:

- First, your AI responses are good enough to use.
- Second, because they're good enough to use, you no longer need to post publicly about programming questions.

Where does AI go after it's "perfected itself"?

Or, must we live in a dystopian world where code is scrapable for free, regardless of license, but access to support in an AI from that code comes at a price?

Programming

Is C++ Gaining in Popularity? (i-programmer.info) 106

An anonymous reader shares this report from Dice.com: C++ is enjoying a surge in popularity, according to the latest update to the TIOBE Index, which tracks programming languages' "buzz."

C++ currently sits right behind C and Python on TIOBE's list. "A few months ago, the C++ programming language claimed position 3 of the TIOBE index (at the expense of Java). But C++ has not finished its rise. C seems to be its next victim," added the note accompanying the data... ["At the moment, the gap between the two is only 0.76%."]

Matlab, Scratch and Rust also match their all-time-high records at positions #10, #12 and #17, respectively.

So here, according to TIOBE, are the 10 most popular programming languages:

1. Python
2. C
3. C++
4. Java
5. C#
6. JavaScript
7. Visual Basic
8. SQL
9. PHP
10. MATLAB

The site I Programmer digs deeper: C++ was the only one of the top four languages to see a positive year-on-year change in its percentage rating — adding 0.79% to stand at 10.8%. Python had the smallest loss of the entire Top 20, -0.01%, leaving it with a share of 13.42%, while Visual Basic had the greatest loss at -2.07%. That drop, combined with JavaScript gaining 1.34%, led to JavaScript overtaking Visual Basic to occupy #6, its highest ever ranking in the TIOBE Index.
They also note that COBOL "had a 3-month rise going from a share of 0.41% in April to 0.86% in July which moved it into #20 on the index."
Supercomputing

Cerebras To Enable 'Condor Galaxy' Network of AI Supercomputers 20

Cerebras Systems and G42 have introduced the Condor Galaxy project, a network of nine interconnected supercomputers designed for AI model training with a combined performance of 36 FP16 ExaFLOPs. The first supercomputer, CG-1, located in California, offers 4 ExaFLOPs of FP16 performance and 54 million cores, focusing on Large Language Models and Generative AI without the need for complex distributed programming languages. AnandTech reports: CG-2 and CG-3 will be located in the U.S. and will follow in 2024. The remaining systems will be located across the globe and the total cost of the project will be over $900 million. The CG-1 supercomputer, situated in Santa Clara, California, combines 64 Cerebras CS-2 systems into a single user-friendly AI supercomputer, capable of providing 4 ExaFLOPs of dense, systolic FP16 compute for AI training. Based around Cerebras's 2.6 trillion transistor second-generation wafer scale engine processors, the machine is designed specifically for Large Language Models and Generative AI. It supports up to 600 billion parameter models, with configurations that can be expanded to support up to 100 trillion parameter models. Its 54 million AI-optimized compute cores and massive fabric network bandwidth of 388 Tb/s allow for nearly linear performance scaling from 1 to 64 CS-2 systems, according to Cerebras. The CG-1 supercomputer also offers inherent support for long sequence length training (up to 50,000 tokens) and does not require the complex distributed programming languages that are commonly needed for GPU clusters.

This supercomputer is provided as a cloud service by Cerebras and G42 and since it is located in the U.S., Cerebras and G42 assert that it will not be used by hostile states. CG-1 is the first of three 4 FP16 ExaFLOP AI supercomputers (CG-1, CG-2, and CG-3) created by Cerebras and G42 in collaboration and located in the U.S. Once connected, these three AI supercomputers will form a 12 FP16 ExaFLOP, 162 million core distributed AI supercomputer, though it remains to be seen how efficient this network will be. In 2024, G42 and Cerebras plan to launch six additional Condor Galaxy supercomputers across the world, which will increase the total compute power to 36 FP16 ExaFLOPs delivered by 576 CS-2 systems. The Condor Galaxy project aims to democratize AI by offering sophisticated AI compute technology in the cloud.
"Delivering 4 exaFLOPs of AI compute at FP16, CG-1 dramatically reduces AI training timelines while eliminating the pain of distributed compute," said Andrew Feldman, CEO of Cerebras Systems. "Many cloud companies have announced massive GPU clusters that cost billions of dollars to build, but that are extremely difficult to use. Distributing a single model over thousands of tiny GPUs takes months of time from dozens of people with rare expertise. CG-1 eliminates this challenge. Setting up a generative AI model takes minutes, not months and can be done by a single person. CG-1 is the first of three 4 ExaFLOP AI supercomputers to be deployed across the U.S. Over the next year, together with G42, we plan to expand this deployment and stand up a staggering 36 exaFLOPs of efficient, purpose-built AI compute."
Programming

Most Outsourced Coders In India Will Be Gone In 2 Years Due To AI, Stability AI Boss Predicts (cnbc.com) 85

Most outsourced programmers in India will see their jobs wiped out in the next year or two, Stability AI CEO Emad Mostaque said. CNBC reports: Mostaque, on a call with UBS analysts, said that most of the country's outsourced coders will lose their jobs as the effects of AI mean that it is now possible for software to be developed with far fewer people. "I think that it affects different types of jobs in different ways," Mostaque said on a call with analysts at the Swiss investment bank last week. "If you're doing a job in front of a computer, and no one ever sees you, then it's massively impactful, because these models are like really talented grads."

According to Mostaque, not everyone will be affected in the same way, however. That is due in no small part to differing rules and regulations around the world. Countries with stronger labor laws, like France, will be less likely to see such an impact, for example. In India, Mostaque said, "outsourced coders up to level three programmers will be gone in the next year or two, whereas in France, you'll never fire a developer." "So it affects different models in different countries in different ways in different sectors."

Mostaque reiterated a previous statement he made saying that there will be "no more programmers" in five years' time -- however, he caveated this to say that he meant coders in the traditional sense. "Why would you have to write code where the computer can write code better? When you deconstruct the programming thing from bug testing to unit testing to ideation, an AI can do that, just better," Mostaque said. "But it won't be doing it automatically, it will be AI 'co-pilots,'" Mostaque said. "That means less people are needed for classical programming, but then are they needed for other things? This is the question and this is the balance that we have to understand, because different areas are also affected differently."

Music

Plex's Winamp-inspired Music Player Plexamp is Now Free (techcrunch.com) 46

Plexamp, the music player originally incubated by the Labs division of media company Plex, is now free, the company announced today. From a report: The project was first launched in 2017 as Plex's own spin on the classic Winamp media player app, offering visualizations to accompany your tunes, tools for programming mixes, and more recently, a ChatGPT-powered "Sonic Sage" feature that builds unique playlists from users' music libraries. However, after its expansion from desktop to mobile, Plexamp was only available to subscribers.

Now, Plex says the Plexamp app will become free, allowing users to play tracks from their own library or the TIDAL music streaming service with high-quality audio and support for lossless audio. The app also includes gapless playback, loudness leveling, and smooth transitions between tracks, among other things. In addition to Library Radio, a feature used to rediscover your music, users can create playlists with Plexamp to match their current mood: like "brooding, cathartic, confident, intense, playful, poignant, swaggering, and wistful," the company says.

AI

Anthropic Releases a New Version of Its ChatGPT Rival, Claude (bloomberg.com) 23

Anthropic, an artificial intelligence startup positioning itself as the builder of a safer kind of chatbot, has released a new version of its AI bot, named Claude. From a report: Anthropic said that Claude 2 is available to anyone in the US or UK online at claude.ai, and businesses can access it via an application programming interface. The new release on Tuesday comes several months after Anthropic began offering an earlier version of Claude to businesses that wanted to add it to their products. Previously, the bot was tested by a handful of companies including Quora, which built it into an app called Poe that lets users ask questions.

Like its predecessor, Claude 2 is built atop a large language model and can be used for written tasks like summarizing, searching, answering questions and coding. Both models can currently take in large chunks of text -- a user can ask it to summarize a book, for instance -- though Claude 2 can generate longer responses than its predecessor. Responses can reach up to about 3,000 words, according to data provided by the company. Claude 2 will also offer more accurate responses on some topics, such as coding and grade-school-level math, the company said. Anthropic's goal has been for Claude to be less susceptible than other chatbots to manipulation.
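The summary mentions that businesses can reach Claude 2 through an application programming interface. Purely as an illustration, here is roughly what a request through Anthropic's Python SDK might look like; the SDK calls are based on Anthropic's public client library, and the model identifier is an assumed placeholder rather than a detail from the Bloomberg report.

```python
# Hypothetical illustration of calling a Claude-family model through Anthropic's
# Python SDK; the model name is an assumed placeholder, not taken from the article.
from anthropic import Anthropic

client = Anthropic()  # reads the ANTHROPIC_API_KEY environment variable

response = client.messages.create(
    model="claude-2.1",   # assumed model identifier
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Summarize the key points of this meeting transcript: ..."},
    ],
)
print(response.content[0].text)
```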

Anime

Hulu Launches Adult Animation, Anime Hub Animayhem (variety.com) 24

According to Variety, Hulu is launching a new sub-brand focused on adult animation and anime called Animayhem. From the report: The new hub is meant to capitalize on Hulu's already popular lineup of adult animation and anime shows. Series like "American Dad," "Bob's Burgers," "Family Guy," "Futurama" and "King of the Hill" consistently rank among the service's top 10 shows based on hours streamed, per Hulu. So far this year, over one billion hours of adult animation content has been streamed on Hulu, along with over 288 million hours of anime content, the streamer claims.

Hulu currently has 46 adult animated series, which adds up to 174 seasons and 2,600 episodes. For anime, Hulu has 17 films and 272 series, adding up to 435 seasons and 18,400 episodes (including subtitled and English dubbed versions of episodes). As part of the launch, Hulu is debuting a range of ads for Animayhem, which it describes as the "Animation Destination."
"When you have the number one offering in adult animation and anime of any major streaming service, creating this destination is obvious. We know exactly where we can meet these fans, because they're already here," said Barrie Gruner, Hulu's executive vice president of marketing and publicity, in an interview with Variety.

"I would say that this brand really cements Hulu as the ultimate streaming destination for animation and we're not going to achieve that with single title campaigns," Gruner said. "This is truly an intersection with our original programming and our library."

Animayhem will also be coming to San Diego Comic-Con via an immersive experience dubbed "Hulu Animayhem: Into the Second Dimension."
Programming

Why Are There So Many Programming Languages? (acm.org) 160

Long-time Slashdot reader theodp writes: Recalling a past Computer History Museum look at the evolution of programming languages, Doug Meil ponders the age-old question of Why Are There So Many Programming Languages? in a new Communications of the ACM blog post.

"It's worth noting and admiring the audacity of PL/I (1964)," Meil writes, "which was aiming to be that 'one good programming language.' The name says it all: Programming Language 1. There should be no need for 2, 3, or 4. [Meil expands on this thought in Lessons from PL/I: A Most Ambitious Programming Language.] Though PL/I's plans of becoming the Highlander of computer programming didn't play out like the designers intended, they were still pulling on a key thread in software: why so many languages? That question was already being asked as far back as the early 1960's."

One of PL/I's biggest fans was Digital Research Inc. (DRI) founder Gary Kildall, who crafted the PL/I-inspired PL/M (Programming Language for Microcomputers) in 1973 for Intel. But IBM priced PL/I higher than the languages it sought to replace, contributing to PL/I's failure to gain traction. (Along the lines of how IBM's deal with Microsoft gave rise to a price disparity that was the undoing of Kildall's CP/M OS: Microsoft's PC DOS, licensed in a 'non-royalty' deal, was offered with every PC at $40, while CP/M was offered 'a la carte' at $240.) As a comp.lang.pl1 poster explained in 2006, "The truth of the matter is that Gresham's Law: 'Bad money drives out good' or Ruskin's principle: 'The hoi polloi always prefer an inferior, cheap product over a superior, more expensive one' are what govern here."

Perl

Perl 5.38 Released with New Experimental Syntax for Defining Object Classes (phoronix.com) 48

Perl 5.38 was released this week "after being in development for more than one year," reports Phoronix. "Perl 5.38 brings a new experimental syntax for defining object classes where per-instance data is stored in 'field' variables that behave like lexicals."

"Maybe, just maybe, the new features introduced into the language in this newest version will attract much sought new talent," writes the site I Programmer, noting the argument that Perl is installed by default everywhere — and has the "fun factor... The class keyword is part of the plan to bring effective object-oriented programming to the Perl core while still keeping Perl being Perl."

The Perl docs warn that "This remains a new and experimental feature, and is very much still under development. It will be the subject of much further addition, refinement and alteration in future releases." But "Since Perl 5, support for objects revolved around the concept of blessing references with a package name," notes updated documentation, which points out this new class syntax "isn't a bless wrapper, but a completely new system built right into the perl interpreter." The class keyword declares a new package which is intended to be a class... classes automatically get a constructor named new... Just like with other references, when an object's reference count reaches zero it will automatically be destroyed.
Phoronix notes that Perl 5.38 also brings a new PERL_RAND_SEED environment variable "for controlling seed behavior for random number generation," along with some new APIs. And I Programmer adds that Perl 5.38 also brings support for Unicode 15.0, adding 4,489 characters for a total of 149,186 characters. Other additions include enhanced regular expressions, plus defined-or and logical-or assignment default expressions in signatures.
Businesses

Amazon CEO Asks His Hollywood Studio To Explain Its Big Spending (bloomberg.com) 110

Amazon CEO Andy Jassy is taking a hard look at how much the company's Hollywood studio spends on original TV programming. From a report: In recent weeks, he has asked executives for detailed budget analyses of some of their biggest shows, according to people familiar with the matter, scrutinizing the studio's ballooning costs and mixed track record with audiences. The world's largest online retailer is engaged in a companywide cost-cutting program, with plans to eliminate at least 27,000 jobs. Across Amazon, Jassy has also jettisoned 37 different projects deemed unnecessary.

The Hollywood studio, which has spent tens of billions of dollars on original programming over the last decade, is an obvious place to look for savings. Last year, Amazon spent $7 billion on original shows, licensed programs and sports, up from $5 billion the year before. Only Netflix and Disney spend more on streaming. In the past nine months, Amazon has released at least a half-dozen pricey series that failed to deliver huge audiences. Daisy Jones & the Six, The Power, Dead Ringers and The Peripheral all cost more than $100 million to produce but failed to crack Nielsen's list of the 10 most-watched streaming programs in the US. Even The Rings of Power ($400 million-plus), a show that attracted a large audience, failed to hold on to most of its viewers over the course of the season, according to The Hollywood Reporter.

The Almighty Buck

US Might Finally Force Cable-TV Firms To Advertise Their Actual Prices (arstechnica.com) 67

The Federal Communications Commission (FCC) has proposed new rules to crack down on hidden fees charged by cable and satellite video providers. "My administration's top priority is lowering the cost of living for the middle class, and that includes cracking down on companies' use of junk fees to hide true costs from families, who end up paying more as a result," President Joe Biden said in a statement on Tuesday. Ars Technica reports: As Biden noted, the FCC "proposed a new rule that would require cable and satellite TV providers to give consumers the all-in price for the service they're offering up front." The proposed rule would force companies like Comcast, Charter Spectrum, and DirecTV to publish more accurate prices. Biden continued: "Too often, these companies hide additional junk fees on customer bills disguised as 'broadcast TV' or 'regional sports' fees that in reality pay for no additional services. These fees really add up: according to one report, they increase customer bills by nearly 25 percent of the price of base service."

FCC Chairwoman Jessica Rosenworcel first floated pricing transparency rules for the TV services offered by cable and satellite companies in March. That effort took a step forward on Tuesday when the commission approved a Notice of Proposed Rulemaking (NPRM) that seeks public comment on rules that would force video providers to offer accurate prices in advertising. "Consumers who choose a video service based on an advertised monthly price may be surprised by unexpected fees related to the cost of video programming that raise the amount of the bill significantly," the NPRM said. The cable and satellite TV companies' practice of listing "Broadcast TV" and "Regional Sports Network" fees separately from the advertised price "can be potentially misleading and interpreted as a government-imposed tax or fee, instead of a company-imposed service fee increase," and make it hard for customers to compare prices across providers, the FCC said.

The docket is available here, and comments will be accepted for 60 days after the NPRM is published in the Federal Register. The FCC said its proposal "would require cable operators and DBS [direct broadcast satellite] providers to clearly and prominently display the total cost of video programming service." The FCC is also seeking comment on whether it has the authority to impose similar requirements on other types of video providers. But Rosenworcel reportedly said in a congressional hearing that the FCC's authority under US law doesn't extend to streaming services.

Intel

Intel's New Font For Low-Vision Developers Is Causing Design Drama For Coders (fastcompany.com) 96

Elissaveta M. Brandon writes via Fast Company: There's a new font in town -- and it's already causing rifts on Reddit. The font is called Intel One Mono, and as its name implies, it was designed by tech giant Intel, together with New York-based type design practice Frere-Jones Type and marketing agency VMLY&R. It joins a group of monospaced fonts designed primarily for developers -- think JetBrains Mono, Fira Code, and Consolas. By definition, monospaced fonts consist of characters that have the same width and occupy the same horizontal space, making it easy for coders and programmers to tell the difference between long strings of characters. But here's where Intel One Mono stands out: it was designed with and for low-vision developers. (It's free to download on GitHub and will soon be available on Google Fonts, too.)

To ensure the font was legible and readable to its target audience, the team ran more than a dozen "live testing sessions" with visually impaired developers who were asked to write code using Intel One Mono. [...] Some of the feedback the designers received was particularly surprising. For example, some people were struggling to tell apart a capital "M" from a capital "N," most likely because both letters have two vertical stems and some diagonals in between, which can be confusing. To make the letters more legible, the designers sloped the vertical stems on the "M" so it looks close to an inverted W. "The point at which the two diagonals meet in the middle gets shifted up to make it clearly a V shape in the middle, and then the two verticals get flared out a little bit to give it slightly more differentiable shape from the capital N," says Fred Shallcrass, a type designer at Frere-Jones Type.

Similar challenges kept coming back with the "x" and the "y" which people struggled to distinguish, and the "e" and the "c." In every instance, the designers meticulously tweaked the letters to make them highly distinctive, resulting in a fairly idiosyncratic font where every glyph is as different as possible from the other -- all the way down to the curly brackets, which can best be described as extra curly. This brings us to that Reddit rift. "This font would be great were it not for those curly braces," one person wrote. "For someone that hates fonts sometimes because of curly brackets not being clear and evident, I'm officially switching to this font set because of the curly brackets," wrote another. The developers were equally torn, but the designers stand by them.
"Part of our thinking in negotiating those responses is that reinforcing the identity of any shape is not just amplifying what is unique about that letter, but also making it clearly not some other letter, so foreclosing any confusion," says Tobias Frere-Jones, the founder and lead designer at his eponymous studio. "If there's a thing the curly braces do, which is that extra back and forth movement, the parentheses don't do that, the brackets don't do that, therefore these ought to do a lot of that."
Bitcoin

Mastercard Submits Fresh Trademark Application For Crypto Tech (crypto.news) 18

According to a recently discovered trademark application, Mastercard plans to develop software optimized for bitcoin and blockchain transactions. The second-largest payment-processing corporation also aims to facilitate crypto-based transactions by streamlining connections between virtual asset service providers (VASPs). Crypto News reports: The trademark application is a fascinating window into Mastercard's plans for the future of digital currency. Details have been revealed about creating a downloadable application programming interface (API) designed to verify transactions inside blockchain networks and ease the handling or trading of cryptocurrency. By standardizing this API software, communication between VASPs may be streamlined and crypto transactions made easier. Mastercard wants to set up a platform for financial institutions to exchange customer information to verify compliance. This new step is significant for Mastercard's fast-growing presence in the cryptocurrency sector. The corporation announced its intention to offer a limited number of cryptocurrencies on its network in February 2021.
AI

Hey Alexa, What Should Students Learn About AI? (nytimes.com) 22

Long-time Slashdot reader theodp writes: "While schools debate what to teach students about powerful new A.I. tools, tech giants, universities and nonprofits are intervening with free lessons," the New York Times reports in "Hey, Alexa, What Should Students Learn About AI?"
Senior Amazon executive Rohit Prasad visited Boston's Dearborn STEM Academy to observe an Amazon-sponsored AI lesson using Alexa, according to the article, which adds that "he assured the Dearborn students there would soon be millions of new jobs in A.I." "We need to create the talent for the next generation," Mr. Prasad, the head scientist for Alexa, told the class. "So we are educating about A.I. at the earliest, grass-roots level."

A few miles away, Sally Kornbluth, the president of the Massachusetts Institute of Technology, was delivering a more sobering message about A.I. to students from local schools who had gathered at Boston's Kennedy Library complex for a workshop on A.I. risks and regulation. "Because A.I. is such a powerful new technology, in order for it to work well in society, it really needs some rules," Dr. Kornbluth said. "We have to make sure that what it doesn't do is cause harm."

The same-day events — one encouraging work in artificial intelligence and the other cautioning against deploying the technology too hastily — mirrored the larger debate currently raging in the United States over the promise and potential peril of A.I. Both student workshops were organized by an M.I.T. initiative on "responsible A.I." whose donors include Amazon, Google and Microsoft.

The article emphasizes that schools face a big question: Should they teach AI programming and other AI-related skills employers will seek? "Or should students learn to anticipate and mitigate A.I. harms?"

Last week, Amazon agreed to pay $25 million to settle federal charges that it had indefinitely kept children's voice recordings, violating the federal online children's privacy law. The company said it disputed the charges and denied that it had violated the law. The company noted that customers could review and delete their Alexa voice recordings. But the one-hour Amazon-led workshop did not touch on the company's data practices.
Programming

Is AI an Excuse for Not Learning To Code? (acm.org) 133

Long-time Slashdot reader theodp writes: Y Combinator founder Paul Graham last week took to Twitter to lament those who use AI or other excuses for not learning to code. "A generation ago some people were saying there was no point in learning to program because all the programming jobs would be outsourced to India," Graham wrote. "Now they're saying you don't need to because AI will do it all. If you don't want to learn to program, you can always find a reason."

BloomTech Coding Bootcamp CEO Austen Allred this week doubled down on Graham's tweet, offering his own history of excuses people have made for not learning to code... Allred's tweet reads:

"Don't learn to code. Soon GUIs will do it all for you." — 1985

"Don't learn to code. Soon that will all be done offshore for pennies." — 2003

"Don't learn to code. Soon nocode tools will do it all for you." — 2015

"Don't learn to code. Soon AI will do it all for you." — 2023

Among the many retweeting Allred's cautionary message was Code.org, the tech-backed nonprofit that aims to make computer science a high school graduation requirement by 2030, whose CEO also replied to Graham with a reassuring tweet suggesting people's days of being able to avoid learning to code will soon be over. "Now that 27 states require that every school must teach computer science, and 7 states require a CS course to graduate high school," explained Code.org CEO Hadi Partovi, "the argument is basically behind us. Computer science won."

On a related note, this month in Communications of the ACM, a CS professor shared their own contrary opinion about the possibility of a professional programmer using AI assistants to do a better job.

"It doesn't work." I would love to have an assistant who keeps me in check, alerting me to pitfalls and correcting me when I err. A effective pair-programmer. But that is not what I get. Instead, I have the equivalent of a cocky graduate student, smart and widely read, also polite and quick to apologize, but thoroughly, invariably, sloppy and unreliable. I have little use for such supposed help...

Fascinating as they are, AI assistants are not works of logic; they are works of words. Large language models: smooth talkers (like the ones who got all the dates in high school). They have become incredibly good at producing text that looks right. For many applications that is enough. Not for programming.

Programming

92% of Programmers Are Using AI Tools, Says GitHub Developer Survey 67

An anonymous reader quotes a report from ZDNet: [A]ccording to a new GitHub programmer survey, "92% of US-based developers are already using AI coding tools both in and outside of work." GitHub partnered with Wakefield Research to survey 500 US-based enterprise developers. They found that 70% of programmers believe AI is providing significant benefits to their code. Specifically, developers said AI coding tools can help them meet existing performance standards with improved code quality, faster outputs, and fewer production-level incidents.

This is more than just people working on external open-source projects or just fooling around. Only 6% of developers said they solely use these tools outside of work. In other words, today, AI programming tools are part and parcel of modern business IT. Why has this happened so quickly? It's all about the programmers' bottom line. It's also all about simply producing more lines of code.
"Engineering leaders will need to ask whether measuring code volume is still the best way to measure productivity and output," added Inbal Shani, GitHub's chief product officer. "Ultimately, the way to innovate at scale is to empower developers by improving their productivity, increasing their satisfaction, and enabling them to do their best work -- every day."

According to the survey, "Developers want to upskill, design solutions, get feedback from end users, and be evaluated on their communication skills."

"In other words, generating code with AI is a means to an end, not an end to itself," writes ZDNet's Steven Vaughan-Nichols. "Developers believe they should be judged on how they handle those bugs and issues, which is more important to performance than just lines of code. [...] Yes, you can have ChatGPT write a program for you, but if you don't understand what you're doing in the first place or the code you're 'writing,' the code will still be garbage. So, don't think for a minute that just because you can use ChatGPT to write a Rust bubble-sort routine, it means you're a programmer now, You're not."
Programming

Google Home's Script Editor Is Now Live (theverge.com) 23

Google has launched its script editor tool, offering advanced automations for Google Home-powered smart homes. The Verge reports: Available starting Tuesday, June 13th, to those in the Google Home public preview, the script editor is part of Google's new home.google.com web interface, which also has live feeds for any Nest cams on your account. The script editor will be coming to the new Google Home app preview starting June 14th. There's no date for general availability.

Along with allowing for multiple starters and actions, the script editor adds more advanced conditions. For example, you can set an automation to run only if the TV is on and it's after 6PM but before midnight. Script editor automations are created in the new Google Home web interface; you can apply for the public preview here.
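To give a feel for the kind of compound condition described above (run only if the TV is on, and only between 6PM and midnight), here is the same logic expressed as plain Python. This is just an illustration of the condition being evaluated, not the script editor's actual syntax, and the function and parameter names are made up for the example.

```python
# Plain-Python illustration of the compound automation condition described above;
# this is NOT the Google Home script editor's own syntax, just the logic it expresses.
from datetime import datetime

def automation_should_run(tv_is_on: bool, now: datetime | None = None) -> bool:
    """True only when the TV is on and the local time is between 6PM and midnight."""
    now = now or datetime.now()
    return tv_is_on and 18 <= now.hour < 24

if automation_should_run(tv_is_on=True):
    print("Run the automation's actions")
```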

The script editor allows you to do everything you can in the Home app when setting up automations, plus "more than 100 new features and capabilities to fit your unique understanding of your home and what you want it to do," according to a blog post by Anish Kattukaran, director of product management at Google Home. This includes access to nearly 100 starters and actions, including Matter sensors -- something not currently possible in the Home app. For example, an Eve Motion sensor connected via Matter to Google Home can't currently be used as a starter for automations in the Home app but can be used as one in the script editor.
Google has some example automations that you can view here.
