Posted by msmash ry 16, 2026 @09:00AM from the hypocritical-views dept.
theodp writes: Code.org, the nonprofit backed by AI giants Microsoft, Google, and Amazon, and whose Hour of AI and free AI curriculum aim to make the world's K-12 schoolchildren AI literate, points job seekers to its AI Use Policy in Hiring, which promises dire consequences for those who use AI during interviews or take-home assignments without its OK.
Explaining "What's Not Okay," Code.org writes: "While we support thoughtful use of AI, certain uses undermine fairness and honesty in the hiring process. We ask that candidates do not [...] use AI during interviews and take-home assignments without explicit consent from the interview team. Such use goes against our values of integrity and transparency and will result in disqualification from the hiring process."
Interestingly, Code.org CEO Hadi Partovi last year faced some blowback from educators over his LinkedIn post that painted schools that police AI use by students as dinosaurs. Partovi wrote, "Schools of the past define AI use as 'cheating.' Schools of the future define AI skills as the new literacy. Every desk-job employer is looking to hire workers who are adept at AI. Employers want the students who are best at this new form of 'cheating.'"
by Pseudonymous Powers (4097097) writes:
Before everyone starts dunking on this, remember that it's possible, even likely, that Code.org used an LLM to write their AI-use policy.
And then resume dunking.
by DarkOx (621550) writes:
Anyone want to take bets on how long it takes their corporate masters to make them square their policy with their marketing efforts, like stuffing Copilot everywhere they can?
by jw867 (97358) writes:
They all will use AI to rate your interview and any work you produce for the interview. They will also use any and all of your work product from your interview to train their AI models.
by Pseudonymous Powers (4097097) writes:
Even with in-person interviews, you can't be 100% certain that they aren't coding guided by the vibrations of an anal bead.
by evslin (612024) writes:
> We ask that candidates do not [...] use AI during interviews and take-home assignments without explicit consent from the interview team.
Assign a take-home task as part of your interview and you're dead to me, so I guess we're even?
by Berkyjay (1225604) writes:
Take home tests are far better than in-person, proctored coding tests.
by radarskiy (2874255) writes:
> Take home tests are far better than in-person, proctored coding tests.
That doesn't mean that either is an effective part of the interview process.
by Berkyjay (1225604) writes:
Well yeah, coding tests in general are fucking stupid and pointless. If someone has a resume and prior job experience, let me talk to their former boss. If someone is a new grad, let me talk to their professor. But if I HAVE to give a test, I am giving someone a one-day project. I want to see if they can accomplish a task, and I really don't care how efficient their leetcode game is. At the end of the day, results are what pay the bills. Not how clever your code syntax is.
by CubicleZombie (2590497) writes:
I'm a graybeard developer, so I've been coding WAY longer than AI has been around. I don't vibe code in any way, but when I'm coding in VS Code with Copilot enabled, I'll be typing a line and AI figures out what I'm doing and completes the line. TAB. Next line, AI figures out what I'm doing and completes the line. TAB.
I'm getting used to it and maybe a little dependent on it. Put me in a coding interview without AI, and it might be harder than it used to be.
by timeOday (582209) writes:
Yeah, there's just no reason to memorize all these huge APIs any more. Is that the same as letting AI take over the design of your code? Nope.
by pete6677 (681676) writes:
Coding without any AI assistance will be much like LeetCode in that it will be a skill that developers must learn in order to pass coding interviews, which have nothing to do with how coding is done on the job.
by SlashbotAgent (6477336) writes:
A completely reasonable and intelligent policy that does not, in any way, invalidate their greater message or goal.
Let's try to make a big deal out of this outrage clickbait. /s
Analogy: 'Winchester firearms does not tolerate the use of firearms as a coercive tool during the hiring process.'
It's not a double standard.
by FictionPimp (712802) writes:
We do not allow candidates to use AI, IDEs, or high level languages to perform their coding tests. In their real jobs we know they use these things, but why should the hiring process reflect the skillset we require?
by pete6677 (681676) writes:
Nothing is really changing here. When has the hiring process ever really reflected the skillset required?
by SpinyNorman (33776) writes:
Exactly.
This seems like laziness from Code.org.
Rather than rejecting candidates that use AI, how about adapting your candidate evaluation process to evaluate whether they know how to productively use AI coding tools (beyond just "vibe code me an app to do X")?
Using AI isn't cheating - it's a tool that, as a developer, you need to learn to use.
It's like rejecting a candidate for using a calculator to do math, or for using Google to search for an algorithm, rather than doing the math with pencil and paper.
by Viol8 (599362) writes:
Perhaps they want to hire someone who has a deeper understanding of algorithms than just cut-n-pasting them, so when a real gnarly problem shows up they can actually solve it rather than just saying "Uh, AI can't fix it for me, dunno what to do."
Perhaps you'd be happy if your doctor just asked ChatGPT to diagnose you. I mean, he's just using a tool, right? What's the problem?
by Viol8 (599362) writes:
Which part of those use AI exactly?
by SpinyNorman (33776) writes:
It's easy enough to check whether they understand the code they've submitted, whether they wrote it or not, by getting them on a video call and asking questions about the code.
If they used AI to write the code, but show a level of understanding of it (including the motivation for writing it that way, alternatives, etc.) equal to if they had written it themselves, then I don't see a problem.
by SpinyNorman (33776) writes:
People lying on their resumes is nothing new, but the truth comes out when you talk to them or put them in front of a whiteboard.
IMO a phone interview or at-home assignment should only ever be a screening step, but if you are going to hire without ever having interviewed the person face to face, and in front of a whiteboard, then you'd better be good enough at interviewing (it doesn't take much) to weed out the liars.
by thegarbz (1787294) writes:
> but why should the hiring process reflect the skillset we require?
The skillset requires the knowledge to see that AI is often wrong. It's not at all unreasonable to test the applicant in an interview, rather than whatever OpenAI's latest creation is.
by OrangeTide (124937) writes:
Why should anyone play their gatekeeper games? Just move on; they're basically irrelevant anyway.
by greytree (7124971) writes:
code.org: Do what we say, not what we do.
Also code.org: We'll pay you not to teach boys.
Fuck code.org
by mrbester (200927) writes:
You use AI in the hiring process, you're dead to me.
Remember, a candidate also interviews you and your company. If you think it is a one-way process, you just failed the interview.
by hdyoung (5182939) writes:
This is why universities frequently ignore what industry and society SAYS it needs. One month, why haven't we put AI into literally every course, including English and history. What, our toilet paper dispensers aren't AI? What's wrong with us dinosaurs? Next month, the companies consider AI to be cheating and blame the educational system for being too AI-friendly. One month, industry demands Java and Python skills. The next month, they're trashing the universities because we haven't produced millions of AI programmers with 10 years of skills in a field that's barely existed for 24 months. I've seen this cycle multiple times for CS skillsets. It's happened with law degrees. Before that, society went completely hot and cold on the entire field of aerospace engineering, growing it at an unsustainable rate and then cratering an entire generation of engineers that took the bait. I'm not saying universities should ignore societal and industry needs, but our clients don't think very far into the future.
by hjf (703092) writes:
Believing AI is not here to stay is just delusional, and universities better start adjusting accordingly.
I'm tired of hearing that this is just "a bubble that's going to burst any time soon." You know what was also a bubble, that also burst? The fucking internet.
And now we can't even imagine our lives without the internet.
by hdyoung (5182939) writes:
I totally agree that universities need to incorporate AI. The questions are "how fast" and "how much" and "how deep". My position is that the universities should aim to provide enough AI experts to fill the workforce needs that will exist AFTER the bubble pops, and the corporate world can bite the bullet and repurpose current CS employees to meet the demand transient.
Any CS program that starts churning out hundreds/thousands of AI-specialized CS majors, in order to meet the current bubble-level demand,
by davidwr (791652) writes:
Maybe the real "test" is your ability and willingness to follow instructions even if you think they are inefficient/wrong/not-the-way-I-would-do-it.
Those are things management is usually looking for, even if they won't say so out loud.
Or take the cynical view: maybe management is looking for people who can lie and not get caught, with the goal of promoting them to the C-suite.
by sevenfactorial (996184) writes:
I'm a CS professor. The problem with AI is that allowing it and disallowing it both lead to awkward outcomes.
Suppose you allow it. Then what are you going to ask students to do? Implement bubblesort? No. That would be pointless -- AI trivially generates all boilerplate code. Of course, Chegg long ago broke the oldest and best coding assignments, which consisted of implementing classic algorithms from pseudocode. Still, all traditional undergrad assignments are out the window.
What *can* you ask students to implement? Realistically, they should be able to do just about anything. The order to "Recreate Facebook" is a valid two-day HW assignment. But that's so broad that it's ungradable. And the students' ability to do the task means very little about their inherent ability.
So, suppose you *don't* allow the use of AI. Then almost everyone will use AI anyway. Now you're not a professor, you're a detective, and everyone in your course is a suspected criminal. If students are smart, AI work is easily disguised. Then who are you giving A's to? Cheaters. Solution: give everyone an A. Educational value for most students, who need to be threatened and cajoled to do work: zero.
Does that mean I'm saying that AI makes everyone a genius and you can't tell the difference anyway? No. The pinch happens when you get out on the bleeding edge and try to do something truly novel. But that is not how instruction of any kind traditionally works. Things at the bleeding edge are incomprehensible to students. Asking students in CS 101 to blaze a new trail is a stupid assignment.
Conclusion: things are very broken, and many students are in trouble. The only thing you can really do to educate the typical person (who requires cajoling and threats) is to lock them in a Faraday cage for four years (and they would probably still cheat). On the other hand, for the *very* rare individual who is self-motivated and just wants to learn, it is a golden age.
I suppose we should really be teaching students how to use AI to educate themselves.
by Tschaine (10502969) writes:
I wonder how long until people who primarily write in high-level programming languages will be seen the way we now see people who primarily write in assembly language.
One key difference, though: it's pretty rare that you have to switch from C# (for example) to assembly in the same project, but it's 100% normal to switch from English to C# when the LLM generates code that doesn't do what it was supposed to.
Maybe stop giving programming assignments, and instead give in-class exams to test your students' understanding.
by invisiblefireball (10371234) writes:
The hypocrites are pretty emboldened these days. This is absolutely pompous.
by allo (1728082) writes:
Isn't HR already using AI for interviews? Sometimes even trying to get data about you from your facial traits? It would only be fair play if the interviewed person could also use AI...
by JoshZK (9527547) writes:
The cat's out of the bag
The genie is out of the bottle
The toothpaste is out of the tube
You can’t put the toothpaste back in the tube
The secret’s out
The damage is done
That ship has sailed
What’s done is done
No turning back now
The bell has been rung
The word is out
It’s in the wild now
The horse has left the barn
The horse has bolted
The arrow has left the bow
The die is cast
The Rubicon has been crossed
The ink is dry
The fuse has been lit
The floodgates are open
Pandora's box is open
by BoogieChile (517082) writes:
You want people who know how it works. That way, when the AI throws out garbage (and it *will* throw out garbage), it doesn't make it into production.