

Artificial intelligence has arrived in our classrooms. At El Horizonte School and other schools, we teachers and students have welcomed it with open minds and plenty of questions to work through together.
Welcome to the school. Please come in quietly; we're working.
Look there: seated on chairs arranged in a semicircle, a second-year ESO teacher guides the students through a Socratic dialogue: "Do you think it's right for a machine to make decisions for you?" "Where does the data that feeds this machine's decision-making come from?" "Is that data exhaustive? Is it biased?"
Now turn your head. In the lab, you'll find third- and fourth-year secondary school students absorbed in their computers: they're creating a nascent AI. They feed it data, program algorithms, train the neural network, test it, and iterate on it. They've taken biases into account, or so they think, because they're 16-year-old boys and girls who still know little about their own biases. Then again, what do we adults know about ours? But there they are, happily creating the embryo of an AI.
And if you continue down the corridor, you will see about fifteen first- and second-year ESO students sitting on stools: they are discussing The Blue Book of Nebo, by Manon Steffan Ros, published by Periscopio. They talk about the effects of a possible post-apocalyptic, post-technological world. For the protagonists, technology is once again tied to the hands, to making objects, above all in order to feed themselves and find shelter. The debate among the students is binary: some believe the post-apocalyptic, post-technological world of The Blue Book of Nebo is dystopian; others, utopian. "We would be free from the slavery of technology. Children would play in the streets again," declares one young woman. "But we wouldn't have video games or amusement parks," argues another.
And now one last scene: do you see that first-year ESO student? The one in the back row. He's grabbed the computer, opened ChatGPT, and is asking for ideas to write a scary story in the style of Poe's The Tell-Tale Heart, which they have previously read in class. Now he stops: he is thinking about the instructions; prompts, they call them. Look, he continues: "I want the story to be no more than 1,000 words. I want it to be written in the third person and in the past tense. I want the protagonist to be an 18-year-old boy. I want the story to be scary, but not violent. I want it to take place in a castle. And I want the protagonist to also be obsessed with a strange noise." Furtively, he senses that he will be caught and booed, without quite knowing why. What are the limits? How should he use it? But he shouldn't worry: his classmates and his teachers are learning alongside him, they are growing, they are open to dialogue, to making mistakes together, to not letting themselves be overcome by fear or by childish denial.
This is the key. Generative artificial intelligence is already in school. From honest curiosity and joy, from the desire to be critical and creative, we begin to understand it and to love it a little. No pessimism: teachers are optimists by nature, possibilists, rigorous, and curious; otherwise, we'd be doing something else. I promise you, we're not here for the money. So let's open the doors to philosophy, ethics, literature, technology, mathematics, programming, history... let's approach generative artificial intelligence from a transversal perspective. As Plutarch said, "The mind is not a vessel to be filled, but a fire to be lit," and our students' minds have been on fire for a while now.