Thursday, December 18, 2025

AI That Actually Gets Adult Life: What Working Learners Want (and Fear) from Smart Education


 

By Xi Lin

You’re 30-something, juggling a 9-to-5, two kids, and a burning need to upskill before your industry gets eaten by bots. You open an online course at 10 p.m. The first quiz feels like it was written by a robot… for robots. Sound familiar?

Now imagine the opposite: AI that knows you’re exhausted, skips the fluff, and coaches you to master a skill—on your schedule—while gamifying the grind and never judging your 2 a.m. brain fog. That’s not sci-fi. It’s what 68 working adult learners told researchers they actually want from AI-powered education.

 

The Real Talk from Real Adults

Jinhee Kim and colleagues (2025) didn’t just theorize. They studied 48 AI-redesigned e-portfolios and grilled 20 diverse grad students (architects, psychologists, data nerds—ages 24–45, all with day jobs). These weren’t ivory-tower kids. They were you.

“AI doesn’t just give me the answer—it helps me practice until I’ve mastered it. That’s what builds my confidence.” – a mid-30s marketer in the study

 

The Wishlist: What Adult Learners Ask AI to Do

| Craving | How Often It Came Up (portfolios / interviews) | Why It Matters |
|---|---|---|
| Mastery + Lifelong Habits | 85% / 70% | “I need to own the skill, not rent it for a grade.” |
| Real-World Scenarios & Games | 60% / 50% | Badges > exams. Simulate my actual job. |
| Smart Prompting & Self-Awareness | – / 75% | Teach me to ask AI better questions. |
| Ethical “Friendship” with AI | – / 45% | I want trust, not blind faith. |
| VR Games + Chatty Interfaces | 65% / – | Make it immersive, not clunky. |

 

 

The Glow-Ups AI Delivered

  • ⏰ Time Back: Personalized paths cut busywork. One learner: “I spent 40% less time hunting resources.”
  • 🎮 Fun Without Fluff: Gamified stealth assessments (think Duolingo for Excel mastery) boosted motivation by 55%.
  • 🤝 Real Collaboration: AI grouped learners for simulated projects—suddenly, “teamwork” didn’t suck.
  • 🧠 Metacognition on Steroids: AI nudged, “Why did you choose that prompt?” → deeper thinking.
  • ❤️ Emotional Safety Net: AI as a “non-judgy tutor” reduced anxiety. One shy learner: “I asked it 47 dumb questions. Zero side-eye.”

 

But Hold Up—AI’s Dark Side

“It was so fast I stopped double-checking.” – A cautionary tale

  • Over-Reliance Risk: Some copied AI outputs verbatim. Learning? Crickets.
  • Echo-Chamber Trap: Hyper-personalized content → zero exposure to wild ideas.
  • Bias Blind Spots: AI parroting flawed data → learners spreading bad info.
  • Trust Issues: “If AI says I’m 90% proficient… am I really?”

 

The Smart Playbook: Human + AI = Unstoppable

Kim’s crew dropped a 4-step cheat code:

  1. Start with why → Use AI to personalize but force diverse viewpoints.
  2. Assess like a pro → Mix scenarios, badges, and “show me you can do it.”
  3. Train the brain → Teach prompting, ethics, and “AI, explain yourself.”
  4. Verify everything → Make cross-checking AI a graded skill.

Result? AI handles the grunt work. You handle the genius.

 

The Future Isn’t AI vs. You—It’s AI with You

This study screams: AI doesn’t replace adult learners. It upgrades them.

You’re not just cramming for a certificate. You’re:

  • Mastering skills that pay the bills
  • Building habits that last a career
  • Collaborating like a boss
  • Thinking deeper because AI took the shallow stuff

As one participant put it:

“The future isn’t the person who learns fastest. It’s the person who learns smartest—with AI as co-pilot.”

Ready to co-pilot your upskill? Your move.

 

Reference

Kim, J., Yu, S., Detrick, R., Lin, X., & Li, N. (2025). Designing AI-powered learning: Adult learners’ expectations for curriculum and human-AI interaction. Educational Technology Research and Development. https://doi.org/10.1007/s11423-025-10549-z

 

Thursday, November 20, 2025

AI in Construction Classrooms: How Togal AI is Changing the Way Students Estimate Projects

 


By Xi Lin

Picture this: You’re in a construction estimating class, staring at a giant floor plan PDF. There are rooms within rooms, hallways branching like mazes, and four different flooring types you need to calculate and categorize. Your team is trying to divide the work, someone is wrestling with Bluebeam’s measurement tools, and someone else is triple-checking the math in Excel.

It’s slow. It’s draining. And if one person mis-clicks, the whole estimate can collapse like a house of cards.

Now imagine instead that you upload the blueprint and — boom — the software highlights, categorizes, and tallies most of the measurements for you.

That’s the promise of Togal AI, an AI-powered estimation tool that researchers Drs. Tianjiao Zhao, Xi Lin, and Ri Na studied alongside the industry-standard Bluebeam Revu 20. They explored: Can AI actually improve learning in construction education, or does it risk making students too dependent on automation?

 

The Problem: Estimation is Hard… and Time-Consuming

Construction estimation isn’t just typing numbers in a calculator — it’s spatial interpretation, precision, grouping, materials logic, and constant rechecking. Traditional digital tools like Bluebeam are powerful but require tons of manual clicking, adjusting, verifying, and recalculating.

And in a classroom full of students new to the process?

It’s even harder.

 

The Experiment: AI vs. Manual Digital Tools

Sixty undergraduate students worked in small groups to estimate the flooring for a school building:

  • One set of students used Bluebeam first
  • The other used Togal AI first
  • Then they switched tools and compared experiences
  • Surveys + task data + reflections were analyzed

The steps included:

  1. Interpreting the floor plan
  2. Dividing group roles
  3. Measuring four types of flooring
  4. Summarizing the areas
  5. Handling a change order scenario (a real-world curveball)
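The arithmetic behind steps 3–5 can be sketched in a few lines of Python. The room data below is hypothetical (in class, the measurements come from Bluebeam or Togal AI reading the floor plan); the sketch just shows what “summarize the areas” and “handle a change order” mean as computation:

```python
# Sketch of the estimation arithmetic in steps 3-5.
# Room areas here are made-up placeholders; in the study they come
# from Bluebeam / Togal AI measurements of the school floor plan.

rooms = {
    "classroom_101": {"flooring": "carpet",   "area_sqft": 750.0},
    "hallway_a":     {"flooring": "vinyl",    "area_sqft": 1200.0},
    "gym":           {"flooring": "hardwood", "area_sqft": 5000.0},
    "restroom_1":    {"flooring": "tile",     "area_sqft": 320.0},
}

def summarize(rooms):
    """Step 4: total measured area per flooring type."""
    totals = {}
    for room in rooms.values():
        totals[room["flooring"]] = totals.get(room["flooring"], 0.0) + room["area_sqft"]
    return totals

before = summarize(rooms)

# Step 5 (the change order curveball): the hallway flooring
# is swapped from vinyl to tile, so the tallies must be redone.
rooms["hallway_a"]["flooring"] = "tile"
after = summarize(rooms)

for ftype in sorted(set(before) | set(after)):
    print(f"{ftype:9s} before={before.get(ftype, 0):7.1f}  after={after.get(ftype, 0):7.1f}")
```

The point of the exercise isn’t the addition itself: it’s that whichever tool does the measuring, a student still has to know which totals a change order invalidates and recheck them.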

 

What Happened: Togal AI Changed the Game

| Task | Bluebeam | Togal AI |
|---|---|---|
| Overall Task Time | Long | 51% faster ⏱️ |
| Accuracy | Good | 20% better 🎯 |
| Confidence | Moderate | 55% boost 💪 |
| Change Order Task | Slow | 76% faster 🚀 |

 

Students using Togal AI spent less time fighting the software and more time actually thinking about the project.

One student put it perfectly: “AI let me focus on the reasoning part instead of the clicking part.”

 

What Improved

  • Efficiency: Less manual measuring → more time discussing decisions
  • Accuracy: AI made fewer calculation errors than students did
  • Teamwork: Faster workflow meant smoother collaboration
  • Confidence: Students felt more capable of handling real project tasks

But Wait — There’s a Catch

AI didn’t replace critical thinking — but some students did lean on it too much.

A few said: “It was so fast that I stopped double-checking.”

Others worried: “If AI does all the work, how will we really learn estimating?”

And one student said the quiet part out loud: “I’m going to be an estimator — I don’t want a robot taking my job.”

This highlights a key takeaway:

AI should support, not replace, the core skill of understanding how estimates are made.

 

The Big Idea: Use Both, but Use Them in Order

The researchers recommend:

  1. Teach manual logic first (Bluebeam).
  2. Then introduce automation (Togal AI).
  3. Have students compare outputs — and explain discrepancies.

This builds:

  • Procedural knowledge 🧱
  • Critical verification habits 🔍
  • And modern AI fluency 🤝

 

Final Thought: AI Isn’t Here to Replace Estimators — It’s Here to Upgrade Them

This study shows that when thoughtfully integrated, AI doesn’t make learning shallow — it frees space for deeper reasoning:

Students weren’t just measuring anymore.

They were:

  • Analyzing change orders
  • Discussing resource trade-offs
  • Thinking like project managers

That’s not automation replacing skill. That’s automation enabling skill.

As the authors show:
The future estimator is not the person who measures fastest — it’s the person who knows how to verify, interpret, and make decisions with smart tools.

 

Reference

Zhao, T., Lin, X., & Na, R. (2025). Integrating AI in construction estimation education: A comparative study of Togal AI and Bluebeam Revu 20. European Journal of Education, 60(4). https://doi.org/10.1111/ejed.70287

 

Wednesday, August 27, 2025

Danmaku in Online Learning: Turning Lonely Study Sessions into Lively Conversations

 


By Xi Lin

 

Picture this: You’re watching an online lecture, alone in your room, trying to stay focused while your phone buzzes temptingly nearby. Suddenly, a stream of floating comments glides across the video: someone cracks a joke about the professor’s example, another student drops a link to a helpful article, and a third points out a mistake in the slides. You’re not just watching anymore—you’re part of a conversation.

 

That’s the magic of danmaku—real-time (or pseudo-real-time) on-screen comments—that researchers Yixuan Zhu, Xi Lin, Jinhee Kim, Ahmad Samed Al-Adwan, and Na Li explored in their study on how it can boost online self-regulated learning (OSRL).

 

The Problem: Asynchronous Learning = Asynchronous Loneliness

Online self-paced courses give students flexibility, but they often strip away something essential—social interaction. Without peers to bounce ideas off or instructors to nudge you forward, it’s easy to feel isolated and disengaged. And disengagement leads to one thing: higher dropout rates.

Enter danmaku. Popular on platforms like Bilibili, it lets viewers comment directly on specific moments in a video—so even if you’re watching later, it feels like your classmates are right there with you. But could this playful, chatty tool actually help students learn better?

 

The Experiment: Danmaku Meets Self-Regulated Learning

The team surveyed 100 university students (and interviewed a few brave volunteers) who used danmaku while watching educational videos. They wanted to know:

  1. Why do students interact with danmaku?
  2. How does it affect their ability to manage their own learning?

Students’ motivations boiled down to three big ones:

  • Information and Entertainment: “Some danmaku give extra info the teacher didn’t cover… plus, funny comments make studying less boring.”
  • Social Connection: “When I see others learning with me, I feel less lonely.”
  • Self-Expression: “If I spot something missing or wrong, I’ll add my take.”

Peer pressure? Surprisingly low on the list—turns out, students didn’t feel forced to join in; they just wanted to.

 

The Good, the Better, and the “Wow, I’m Actually Engaged”

  • Boosted Engagement: Students stayed more focused when they could respond to danmaku in real time.
  • Stronger Reflection Skills: Commenting encouraged them to think critically and synthesize ideas.
  • Self-Efficacy Boost: Helping others or getting replies built confidence.
  • Community Feel: “The comment section feels like a classroom without walls.”

One student summed it up:

“If I can answer a question and get a reply from the teacher later, I feel more motivated to keep learning.”

 

The Catch

Not all interactions are equally helpful.

  • Some students ignored instructor replies if they weren’t immediate.
  • Low-value comments (spam, off-topic chatter) could distract.
  • Without guidance, discussions sometimes stayed surface-level.

 

The Big Takeaway: Interaction Feeds Motivation

The study found that responding and reflection strategies were the most powerful for boosting self-efficacy—and that self-efficacy, in turn, made learning more enjoyable. It’s a feedback loop: the more confident students feel in contributing, the more they enjoy participating, and the more they participate, the more confident they become.

 

Try This in Your Online Class

If you’re a teacher designing video-based lessons:

  1. Seed the conversation: Post thought-provoking or clarifying questions in danmaku.
  2. Highlight student contributions: Recognize helpful comments in follow-up videos.
  3. Set community norms: Encourage useful, respectful, and creative contributions.
  4. Review and adapt: Use danmaku analytics to tweak your teaching.
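For step 4, “danmaku analytics” can start very simply: bucket comment timestamps to find the moments in a video that draw the most chatter, then revisit those spots in your teaching. The comment data below is hypothetical (real platforms export timestamped comment logs in their own formats), but the bucketing idea carries over:

```python
# Minimal sketch: find "hot" moments in a lecture video from danmaku
# timestamps. The comment list is a made-up example; real platforms
# export timestamped comment logs in their own formats.
from collections import Counter

# (video_second, comment_text) pairs
danmaku = [
    (62, "wait, is that slide right?"),
    (65, "yeah, the formula is missing a term"),
    (64, "same question here"),
    (300, "lol"),
    (610, "this example finally made it click"),
]

def hotspots(comments, bucket_seconds=30, top_n=3):
    """Group comments into fixed-width time buckets and rank the busiest."""
    counts = Counter(t // bucket_seconds for t, _ in comments)
    return [(bucket * bucket_seconds, n) for bucket, n in counts.most_common(top_n)]

for start, n in hotspots(danmaku):
    print(f"{start // 60}:{start % 60:02d}  {n} comments")
```

A cluster of comments around one slide (here, three comments near the 1:00 mark) is often a signal that the slide confused people, which is exactly the kind of thing worth fixing in the next iteration of the lesson.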

 

Final Thought: Danmaku Won’t Replace Teachers… But It Might Replace the “Lonely Scroll”

This study shows that danmaku isn’t just a gimmick—it’s a bridge between solitary study and social learning. In the words of one participant:

“Even if we’re not in the same room, the danmaku makes me feel like we’re learning together.”

By turning passive watching into active engagement, danmaku can make asynchronous learning feel a lot more alive.

 

Reference

Zhu, Y., Lin, X., Kim, J., Al-Adwan, A. S., & Li, N. (2025). Exploring Human Interaction in Online Self-Regulated Learning Through Danmaku Comments. International Journal of Human–Computer Interaction, 1-14. https://doi.org/10.1080/10447318.2025.2480826

 

Tuesday, July 22, 2025

Seesaw in Teacher Training: The Digital Tool That’s Shaking Up Classrooms

 


By Xi Lin

 

Picture this: A classroom where future teachers—juggling lesson plans, practicums, and caffeine—get to test-drive the same tech they’ll use with their own students. That’s exactly what happened in a recent study (Yang-Heim & Lin, 2024) where teacher candidates tried Seesaw, the popular K-12 learning platform, in their college course. The results? A mix of "This is genius!" and "But will it work with five-year-olds?"

 

The Problem: Tech-Savvy Teachers… Who Aren’t That Tech-Savvy

Today’s teacher candidates grew up with smartphones, but when it comes to using tech in the classroom, many hit a wall. As one participant put it: "I’m comfortable with Instagram, but Seesaw? That’s a whole new world." Sound familiar?

 

Enter Seesaw, a platform loved by K-12 teachers for student portfolios and parent communication. But in higher ed? Crickets. Researchers Yang-Heim and Lin (2024) decided to change that by integrating Seesaw into a literacy methods course—with a twist.

 

The Experiment: Teacher Candidates as Students First

Instead of just teaching about Seesaw, the professor had future educators use it as learners.

 

Here’s how it worked:

  1. Vocabulary Boot Camp: Students rated their understanding of a word, used Seesaw’s drawing/video tools to explain it, and peer-reviewed each other’s work. (Spoiler: Stick-figure definitions got rave reviews.)
  2. Teaching Philosophy Tracker: They documented their evolving teaching beliefs on Seesaw—think audio reflections over doodles of their "aha!" moments.

 

The goal? Let them experience Seesaw’s perks (and pitfalls) before they’re responsible for 25 kindergartners with iPads. Figure 1 shows an example of using Seesaw to display students’ work and to interact with the instructor.

 

 

The Good, the Bad, and the Cute (Because Stick Figures)

The Wins:

  • "Finally, collaboration that doesn’t suck!" Students loved peer feedback and the platform’s visual appeal.
  • "I’d use this for snow days!" Many saw Seesaw as a lifeline for hybrid/online learning.
  • "It’s like a digital scrapbook!" The mix of photos, voiceovers, and drawings made assignments feel personal.

 

The Oops Moments:

  • "Will my kindergartners even get this?" Some worried about the complexity for little learners.
  • "Where’s the teacher manual?" Many felt lost navigating Seesaw’s teacher features (since they’d only used it as students).
  • "Tech is great, but what about writing?" A few students struggled to balance screens with pencil-and-paper skills.

 

The Big Takeaway: "We Need More Practice!"

By the end, 68% of participants wanted to use Seesaw in their future classrooms—but with caveats:

  • More training: “Show me how to assign work, not just submit it!”
  • Age-appropriate hacks: “Maybe simplify the interface for first graders?”
  • Balance: “Tech shouldn’t replace crayons… just complement them.”

 

 

Try This in Your Classroom

For professors or K-12 teachers curious about Seesaw:

  1. Start small: Use it for one activity (e.g., vocabulary visuals).
  2. Play both roles: Have teacher candidates try it as students first, then as teachers.
  3. Embrace the mess: Let them critique it. ("Why is the upload button so tiny?!")

 

Final Thought: Tech Won’t Replace Teachers… But It Will Test Their Patience

Seesaw isn’t magic—but as a bridge between theory and practice? Gold. As one participant summed it up: "It’s like seeing my future classroom… minus the glitter explosions."

 

Seesaw boosted engagement and tech skills for future teachers, but they crave more training—and reassurance that it won’t replace finger painting.

 

Reference

Yang-Heim, G. Y. A., & Lin, X. (2024). Teacher candidates’ perspectives on the integration of digital tools in teacher training programs: A case study of using Seesaw. International Journal of Technology-Enhanced Education, 3(1), 1-19. https://doi.org/10.4018/IJTEE.362622