Dhruv Patel’s buddy knows what he knows, shares his views and even mimics his speech patterns.
That’s because Patel created his pal, who is not a human but an “agentic self” — an AI persona that not only completes tasks but actually reflects a person’s values, voice and goals.
“I want my agent to be a riffing buddy or a friend I can bounce ideas off of,” said Patel, a third-year computer science student at Arizona State University.
“I want it to think like I do, but with all the computational power and all the other intellectual capabilities that I don’t have behind it.”
Patel is part of a new ASU class called “The Agentic Self,” taught by will.i.am, the musician, tech founder and philanthropist. Nearly 80 students from a wide range of majors and age groups are taking the class — in both Tempe and Los Angeles — and learning how to create these personal agents.
The frontman for the pop group Black Eyed Peas and a professor of practice in The GAME School at ASU, will.i.am said that AI can compute quickly but still requires human input.
“Agentic is the next step, where the agent is able to do tasks and workflows on your behalf. You set it on its course, and it would reason, research, browse, generate — all autonomously,” he said.
Unlike large language models such as ChatGPT or Claude, where data is stored in a public cloud, the students’ agents are private and entirely their own. The agents can reason, adapt and accomplish complicated tasks — with minimal input and with the intentions of their creators.
will.i.am is the founder and CEO of FYI.AI, an AI-powered technology platform for creative teams headquartered in Los Angeles. The students in the Agentic Self course are creating their agents with the EDU.FYI platform, a collaboration between ASU and FYI.AI, and the course is a test case for the platform. Eventually, EDU.FYI will be available to all faculty and students at ASU, with the goal of bringing it to other educational institutions as well.
The partnership came about after will.i.am met ASU President Michael Crow last year.
“President Crow pitched, ‘Hey, those agents that you’re building at FYI? What about building them for professors and students?’ That vision is the reason why I’m here,” will.i.am said.
The university has embraced AI, exploring different ways it can enhance learning and discovery. In 2024, ASU became the first university to partner with OpenAI, and last year it expanded the collaboration to bring ChatGPT Edu with GPT-5 to every student, faculty member, researcher and staff member at no cost to the individual. In addition, ASU offers two undergraduate and 10 graduate degrees focused on AI, including a one-of-a-kind degree in artificial intelligence engineering.
will.i.am calls the ASU alliance “forever learning” — a way to equip students in any major with the skills they need for a world powered by AI.
“It’s a moral compass — this urgency to teach, inspire, encourage, mentor and motivate folks to build ethical systems that reflect themselves,” he said.
“How do you equip people with an agent when they’re being replaced by agents? So not only can they keep their job, but have the ability to utilize the power of their own agent to solve problems and create jobs themselves?”
Peter Murrieta, an Emmy-winning producer and interim associate dean of the Herberger Institute for Design and the Arts, is one of three ASU professors facilitating the course with will.i.am. Murrieta said that artists, himself included, are rightly suspicious of AI after the entertainment industry was disrupted by streaming and social media for the profit of corporations.
“This class is asking the question, before we start building our own agents, ‘What do we believe in, what do we care about? What would we never give up?’ And the idea of creating an agent for yourself where you own all the materials you’re putting into that agent makes it much easier to move forward in the world,” said Murrieta, who is also deputy director of the Sidney Poitier New American Film School.
“Anything I tell it, I own, and nobody’s scraping and learning off of it. And it just feels like the only way to move forward as an artist.”
Learning from an artist
Patel said he registered as soon as the class opened.
“I’m a musician, and I grew up on the Black Eyed Peas,” he said.
“It’s one thing to learn agentic AI from a computer science professor talking about all the components or the hardware side of things. It’s another thing to learn it from this creative genius who can explain it to me in a way that only an artist could.”
Jeremiah Holland, a graduate student in ASU’s Narrative and Emerging Media program in Los Angeles, was also intrigued by taking a class with the famous will.i.am.
“But what I really like about the class is that he’s teaching us how to use AI to work on our behalf, because right now AI is owned by the big corporations that get to use your data and you don’t have any say. With this class and with this AI system, the data is under our control and we get to decide how it’s used,” said Holland, who earned a Bachelor of Fine Arts in film and media production with an emphasis in screenwriting and producing from the Poitier School.
He has used other AI chatbots, including ChatGPT and Gemini, to help with fact-checking and grammar.
“When I’m writing a script, if I’m having a hard time with an idea or if I’m writing a character, I’ll ask, ‘Give me some name suggestions,’” he said.
“But this is the first time I get to build it from the ground up and decide how it thinks and how I want it to help me. Being able to see the back end of how these systems are built is one reason I’m glad I took this course.”
Sean Hobson, chief design officer for EdPlus at ASU and a professor in the Mary Lou Fulton College for Teaching and Learning Innovation, is one of the course facilitators and has been key in supporting the partnership with will.i.am.
“Working alongside will.i.am and the team has been one of the most creatively expansive experiences of my life,” Hobson said. “His vision for what students need and can become to thrive in the future pushes everyone around him to think differently.
“Pair that with a university built to move at the speed of the moment, and you get something rare — a course where students aren’t just studying the future, they’re building it. Learning by doing — that’s the whole point.”
Developing with ethics in mind
The EDU.FYI platform is powered by Nvidia, which has donated 80 graphics processing units for the students to use. The book-sized devices keep all of the students’ content private, so it is not available in the cloud.
will.i.am alternates visits between the Tempe campus and a new state-of-the-art classroom he built in his Los Angeles studio — so each week, half the students see him in person, and the other half see him on a video stream.
At a recent class in Tempe, the students shared their agents’ voices, and will.i.am encouraged them to speak freely when recording their audio.
“If you’re reading when you’re talking to it, (your agent) is going to sound like you’re reading. If you’re just messing around with your friends, have them interview you or record a podcast, and talk like you’re naturally having a passionate conversation about the world,” he said.
“That’s when you’re truly going to get the agent to not sound robotic. Because you don’t show up to the world reading a script.”
Heavy-hitter guests
Thanks to will.i.am’s professional relationships, students in the Agentic Self class have been able to learn from a number of technology innovators who have appeared as guest lecturers during the first half of the course.
Ethical design is also a priority of the course.
Pavan Turaga, founding director and professor in The GAME School and one of the course facilitators, said the students are asked to imagine a future in which everyday activities like job interviews would require an agent.
“In that future, the ethical conversations are, ‘What are you willing to give your agent as information about yourself? And what are you not willing to give?’ Because who knows how that agent will behave after that?” Turaga said.
Crow was a guest lecturer at a recent class, and will.i.am asked him:
“These students are about to design and deploy agents in a world where the government, the corporations, the rules, the ethics, they’re all lagging behind the tech. What is one concrete design principle you would ask every student here to hardwire into their agentic self so that 10 years from now, we can honestly say these systems expanded human freedom, creativity and dignity, instead of just optimizing for profit and control?”
Empathy, Crow said.
“The worst thing to be is intelligent or intelligently enhanced and be without empathy. Because you’re then consumed by your own self, your own understanding,” Crow said.
Patel said one student recently asked whether the agents now hold the same biases as their human creators.
“It was a great question,” Patel said. “More than the tech part of making the agent, I’ve been enjoying the philosophical, ethical and moral conversations we’re having.”
Holland recommends that people learn all they can about the technology.
“Instead of being afraid of AI and what it’s going to do, learn about it and learn how it can enhance you to grow your capabilities.”
The course is planned to be offered again in future semesters.