I lost hours of my life because of the difference between Windows and Linux when it comes to handling file extensions. I had working code on my local Windows machine, and the same code would fail on the host machine at Vercel. Claude suspected that could be the problem, but we went around in circles for a long time dealing with that and various file placement issues. Plus, I’m a total newbie when it comes to GitHub (I executed my first Pull Request; now I know what that means). But version 0.001 is now on the Web.
A commenter mentioned that Claude Code uses a file that remembers stuff, which sounds like it addresses the Alzheimer’s problem. But when I watched a video of Anthropic staffers enthusiastic about how Claude Code uses the Command Line, I decided that it’s not for me. I need a lot more handholding.
One issue with my application is that after the user submits a question or a comment, it can take several seconds for the API call to Claude to come back with new dialogue. I found the wait painful. The solution I came up with is a file that we call “banter,” which has cross-talk among the students. This does two things. First, reading it fills the time in which you would otherwise think that the app had frozen. Second, the banter helps bring out the different personalities of the students.
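The pattern can be sketched in a few lines of Python. This is a minimal illustration, not the app's actual code: the banter lines, the function names, and the timings are all made up, and the real Claude call is replaced by a stand-in coroutine.

import asyncio
import random

# Hypothetical banter lines; in the real app these would come from the "banter" file.
BANTER = [
    "Student A: Did anyone actually do the reading?",
    "Student B: I skimmed it on the bus.",
    "Student C: The professor is going to notice.",
]

async def slow_api_call() -> str:
    """Stand-in for the multi-second call to the Claude API."""
    await asyncio.sleep(0.2)  # simulate latency
    return "New dialogue from the model."

async def run_turn() -> tuple[list[str], str]:
    """Show banter while the real response is still pending."""
    task = asyncio.create_task(slow_api_call())
    shown = []
    while not task.done():
        shown.append(random.choice(BANTER))  # fill the wait with cross-talk
        await asyncio.sleep(0.1)             # pace the banter display
    return shown, task.result()

banter_shown, reply = asyncio.run(run_turn())

The idea is simply that the slow call runs as a background task while the loop keeps putting something readable in front of the user, so the interface never looks frozen.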
I don’t know if there is enough on the page to make it worth your while to try it at this point. I mean, I’ve only been at it a few days, and there was a steep learning curve, given that I’ve been out of software development since the last century. But if you want to kick the tires, leave a comment to that effect.
Right now, this is a vanity project to feature my ideas on human interdependence. But it could easily be extended to what I call “substack for professors.” Instead of being an adjunct-professor serf, you could fill out a few forms and out would come a working seminar course that you could sell for subscription revenue. Professors make students buy their textbooks. Make your students subscribe to your seminar!
I experienced the Claude API latency issue too, and the better use case there is to move to Google Gemini Flash; it's significantly faster, and the quality of responses is still very good. Here's some sample code:
from google import genai
from google.genai import types

# Create client (async calls go through client.aio)
client = genai.Client(
    api_key=settings.GEMINI_API_KEY,
    http_options={"api_version": "v1alpha"},
)

model = "gemini-2.0-flash-exp"

# Construct prompt
prompt_text = " "
gemini_prompt = {
    "parts": [{"text": prompt_text}]
}

# Set up generation config to request both text and image output
generation_config = types.GenerateContentConfig(
    response_modalities=["Text", "Image"],
    temperature=0.7,
    max_output_tokens=2048,
)

# Use the async version of generate_content
# (the await must run inside an async function)
response = await client.aio.models.generate_content(
    model=model,
    contents=gemini_prompt,
    config=generation_config,
)
"Professor" used to be a high prestige profession because they were experts in their field and a fount of knowledge. Today they're losing prestige because AI is more expert than they are.
But a major reason "education" works is because it transfers prestige from professor to student. The more prestigious the professor, the more the student gets out of the relationship. While AI can convey the knowledge, it can't impart prestige. Thus students will pay much less attention to AI than to a prestigious professor.
So "professor" is dying out as a profession because there is no longer much prestige attached to the job. But if education means anything, it means a prestige transfer, which has to come from a human. I don't know what job description those humans will have, but they will be key to education.
AI can steal prestige from professors, but it can't award prestige to students. So I think student interest in AI seminars will not be very high.