Thursday, November 20, 2025

AI in Construction Classrooms: How Togal AI is Changing the Way Students Estimate Projects

 


By Xi Lin

Picture this: You’re in a construction estimating class, staring at a giant floor plan PDF. There are rooms within rooms, hallways branching like mazes, and four different flooring types you need to calculate and categorize. Your team is trying to divide the work, someone is wrestling with Bluebeam’s measurement tools, and someone else is triple-checking the math in Excel.

It’s slow. It’s draining. And if one person mis-clicks, the whole estimate can collapse like a house of cards.

Now imagine instead that you upload the blueprint and — boom — the software highlights, categorizes, and tallies most of the measurements for you.

That’s the promise of Togal AI, an AI-powered estimating tool that researchers Tianjiao Zhao, Xi Lin, and Ri Na compared with the industry-standard Bluebeam Revu 20. Their question: Can AI actually improve learning in construction education, or does it risk making students too dependent on automation?

 

The Problem: Estimation is Hard… and Time-Consuming

Construction estimation isn’t just typing numbers into a calculator—it’s spatial interpretation, precision, grouping, materials logic, and constant rechecking. Traditional digital tools like Bluebeam are powerful but require tons of manual clicking, adjusting, verifying, and recalculating.

And in a classroom full of students new to the process?

It’s even harder.

 

The Experiment: AI vs. Manual Digital Tools

Sixty undergraduate students worked in small groups to estimate the flooring for a school building:

  • One set of students used Bluebeam first
  • The other used Togal AI first
  • Then they switched tools and compared experiences
  • Surveys + task data + reflections were analyzed

The steps included:

  1. Interpreting the floor plan
  2. Dividing group roles
  3. Measuring four types of flooring
  4. Summarizing the areas
  5. Handling a change order scenario (a real-world curveball; see the sketch below)
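
To make steps 3–5 concrete, here is a toy Python sketch of the arithmetic the tools automate: group measurements by flooring type, total the areas, then re-run the totals after a change order. Every room name and number below is invented for illustration; this is not the study’s material or either tool’s output.

```python
# Toy sketch of the estimating arithmetic in steps 3-5 (invented numbers).
from collections import defaultdict

# Step 3: measurements taken off the floor plan, tagged by flooring type (sq ft)
measurements = [
    ("carpet",   "Classroom 101",  640.0),
    ("carpet",   "Classroom 102",  655.5),
    ("vinyl",    "Hallway A",      412.0),
    ("tile",     "Restroom 1",      98.0),
    ("hardwood", "Library",       1200.0),
]

# Step 4: summarize total area per flooring type
totals = defaultdict(float)
for flooring, room, area in measurements:
    totals[flooring] += area

# Step 5: a change order re-specs one room from carpet to vinyl
def apply_change_order(totals, old_type, new_type, area):
    """Move `area` square feet from one flooring category to another."""
    totals[old_type] -= area
    totals[new_type] += area

apply_change_order(totals, "carpet", "vinyl", 640.0)  # e.g., Classroom 101

for flooring, area in sorted(totals.items()):
    print(f"{flooring:>8}: {area:,.1f} sq ft")
```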

 

What Happened: Togal AI Changed the Game

Task                 Bluebeam    Togal AI
Overall Task Time    Long        51% faster ⏱️
Accuracy             Good        20% better 🎯
Confidence           Moderate    55% boost 💪
Change Order Task    Slow        76% faster 🚀

 

Students using Togal AI spent less time fighting the software and more time actually thinking about the project.

One student put it perfectly: “AI let me focus on the reasoning part instead of the clicking part.”

 

What Improved

Efficiency: Less manual measuring → more time discussing decisions
Accuracy: AI made fewer calculation errors than students did
Teamwork: Faster workflow meant smoother collaboration
Confidence: Students felt more capable of handling real project tasks

But Wait — There’s a Catch

AI didn’t replace critical thinking — but some students did lean on it too much.

A few said: “It was so fast that I stopped double-checking.”

Others worried: “If AI does all the work, how will we really learn estimating?”

And one student said the quiet part out loud: “I’m going to be an estimator — I don’t want a robot taking my job.”

This highlights a key takeaway:

AI should support, not replace, the core skill of understanding how estimates are made.

 

The Big Idea: Use Both, but Use Them in Order

The researchers recommend:

  1. Teach manual logic first (Bluebeam).
  2. Then introduce automation (Togal AI).
  3. Have students compare outputs — and explain discrepancies (a quick sketch of such a check follows below).

This builds:

  • Procedural knowledge 🧱
  • Critical verification habits 🔍
  • And modern AI fluency 🤝
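
As a concrete (and invented) illustration of step 3, the verification habit can be as simple as comparing the two tools’ category totals and flagging anything that disagrees by more than a chosen tolerance:

```python
# Toy discrepancy check (invented numbers): compare a group's manual takeoff
# against the AI's output and flag categories worth re-verifying.
manual_totals = {"carpet": 1295.5, "vinyl": 412.0, "tile": 98.0}
ai_totals     = {"carpet": 1310.0, "vinyl": 385.0, "tile": 98.0}

TOLERANCE = 0.02  # flag anything differing by more than 2%

for flooring in manual_totals:
    manual, ai = manual_totals[flooring], ai_totals[flooring]
    diff = abs(manual - ai) / manual
    flag = "re-verify" if diff > TOLERANCE else "ok"
    print(f"{flooring:>8}: manual {manual:7.1f} | AI {ai:7.1f} | "
          f"diff {diff:.1%} -> {flag}")
```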

 

Final Thought: AI Isn’t Here to Replace Estimators — It’s Here to Upgrade Them

This study shows that when thoughtfully integrated, AI doesn’t make learning shallow — it frees space for deeper reasoning:

Students weren’t just measuring anymore.

They were:

  • Analyzing change orders
  • Discussing resource trade-offs
  • Thinking like project managers

That’s not automation replacing skill. That’s automation enabling skill.

As the authors show:
The future estimator is not the person who measures fastest — it’s the person who knows how to verify, interpret, and make decisions with smart tools.

 

Reference

Zhao, T., Lin, X., & Na, R. (2025). Integrating AI in construction estimation education: A comparative study of Togal AI and Bluebeam Revu 20. European Journal of Education, 60(4). https://doi.org/10.1111/ejed.70287

 

Wednesday, August 27, 2025

Danmaku in Online Learning: Turning Lonely Study Sessions into Lively Conversations

 


By Xi Lin

 

Picture this: You’re watching an online lecture, alone in your room, trying to stay focused while your phone buzzes temptingly nearby. Suddenly, a stream of floating comments glides across the video: someone cracks a joke about the professor’s example, another student drops a link to a helpful article, and a third points out a mistake in the slides. You’re not just watching anymore—you’re part of a conversation.

 

That’s the magic of danmaku—real-time (or pseudo-real-time) on-screen comments—and it’s exactly what researchers Yixuan Zhu, Xi Lin, Jinhee Kim, Ahmad Samed Al-Adwan, and Na Li explored in their study of how it can boost online self-regulated learning (OSRL).

 

The Problem: Asynchronous Learning = Asynchronous Loneliness

Online self-paced courses give students flexibility, but they often strip away something essential—social interaction. Without peers to bounce ideas off or instructors to nudge you forward, it’s easy to feel isolated and disengaged. And disengagement leads to one thing: higher dropout rates.

Enter danmaku. Popular on platforms like Bilibili, it lets viewers comment directly on specific moments in a video—so even if you’re watching later, it feels like your classmates are right there with you. But could this playful, chatty tool actually help students learn better?
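
Mechanically, the “pseudo-real-time” effect is simple: comments are stored against video timestamps rather than wall-clock time, so every viewer sees the comments anchored near their current playback position, whenever they happen to watch. Here is a minimal Python sketch of that idea (my illustration, not Bilibili’s actual implementation):

```python
# Minimal sketch of timeline-anchored ("danmaku") comments: each comment is
# keyed to a moment in the video, and viewers see the ones near the playhead.
from dataclasses import dataclass

@dataclass
class DanmakuComment:
    video_id: str
    timestamp: float  # seconds into the video where the comment is anchored
    text: str

comments = [
    DanmakuComment("lecture-01", 95.0, "Great example!"),
    DanmakuComment("lecture-01", 96.5, "Here's a related article: ..."),
    DanmakuComment("lecture-01", 412.0, "I think slide 7 has a typo."),
]

def visible_comments(comments, video_id, playback_time, window=5.0):
    """Return comments anchored within `window` seconds of the playhead."""
    return [
        c for c in comments
        if c.video_id == video_id and abs(c.timestamp - playback_time) <= window
    ]

# A student watching at 1:36 sees the two comments anchored around that moment,
# even if they were posted months apart:
for c in visible_comments(comments, "lecture-01", 96.0):
    print(f"[{c.timestamp:6.1f}s] {c.text}")
```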

 

The Experiment: Danmaku Meets Self-Regulated Learning

The team surveyed 100 university students (and interviewed a few brave volunteers) who used danmaku while watching educational videos. They wanted to know:

  1. Why do students interact with danmaku?
  2. How does it affect their ability to manage their own learning?

Students’ motivations boiled down to three big ones:

  • Information and Entertainment: “Some danmaku give extra info the teacher didn’t cover… plus, funny comments make studying less boring.”
  • Social Connection: “When I see others learning with me, I feel less lonely.”
  • Self-Expression: “If I spot something missing or wrong, I’ll add my take.”

Peer pressure? Surprisingly low on the list—turns out, students didn’t feel forced to join in; they just wanted to.

 

The Good, the Better, and the “Wow, I’m Actually Engaged”

Boosted Engagement: Students stayed more focused when they could respond to danmaku in real time.
Stronger Reflection Skills: Commenting encouraged them to think critically and synthesize ideas.
Self-Efficacy Boost: Helping others or getting replies built confidence.
Community Feel: “The comment section feels like a classroom without walls.”

One student summed it up:

“If I can answer a question and get a reply from the teacher later, I feel more motivated to keep learning.”

 

The Catch

Not all interactions are equally helpful.

  • Some students ignored instructor replies if they weren’t immediate.
  • Low-value comments (spam, off-topic chatter) could distract.
  • Without guidance, discussions sometimes stayed surface-level.

 

The Big Takeaway: Interaction Feeds Motivation

The study found that responding and reflection strategies were the most powerful for boosting self-efficacy—and that self-efficacy, in turn, made learning more enjoyable. It’s a feedback loop: the more confident students feel in contributing, the more they enjoy participating, and the more they participate, the more confident they become.

 

Try This in Your Online Class

If you’re a teacher designing video-based lessons:

  1. Seed the conversation: Post thought-provoking or clarifying questions in danmaku.
  2. Highlight student contributions: Recognize helpful comments in follow-up videos.
  3. Set community norms: Encourage useful, respectful, and creative contributions.
  4. Review and adapt: Use danmaku analytics to tweak your teaching.

 

Final Thought: Danmaku Won’t Replace Teachers… But It Might Replace the “Lonely Scroll”
This study shows that danmaku isn’t just a gimmick—it’s a bridge between solitary study and social learning. In the words of one participant:

“Even if we’re not in the same room, the danmaku makes me feel like we’re learning together.”

By turning passive watching into active engagement, danmaku can make asynchronous learning feel a lot more alive.

 

Reference

Zhu, Y., Lin, X., Kim, J., Al-Adwan, A. S., & Li, N. (2025). Exploring human interaction in online self-regulated learning through danmaku comments. International Journal of Human–Computer Interaction, 1-14. https://doi.org/10.1080/10447318.2025.2480826

 

Tuesday, July 22, 2025

Seesaw in Teacher Training: The Digital Tool That’s Shaking Up Classrooms

 


By Xi Lin

 

Picture this: A classroom where future teachers—juggling lesson plans, practicums, and caffeine—get to test-drive the same tech they’ll use with their own students. That’s exactly what happened in a recent study (Yang-Heim & Lin, 2024) where teacher candidates tried Seesaw, the popular K-12 learning platform, in their college course. The results? A mix of "This is genius!" and "But will it work with five-year-olds?"

 

The Problem: Tech-Savvy Teachers… Who Aren’t That Tech-Savvy

Today’s teacher candidates grew up with smartphones, but when it comes to using tech in the classroom, many hit a wall. As one participant put it: "I’m comfortable with Instagram, but Seesaw? That’s a whole new world." Sound familiar?

 

Enter Seesaw, a platform loved by K-12 teachers for student portfolios and parent communication. But in higher ed? Crickets. Researchers Yang-Heim and Lin (2024) decided to change that by integrating Seesaw into a literacy methods course—with a twist.

 

The Experiment: Teacher Candidates as Students First

Instead of just teaching about Seesaw, the professor had future educators use it as learners.

 

Here’s how it worked:

  1. Vocabulary Boot Camp: Students rated their understanding of a word, used Seesaw’s drawing/video tools to explain it, and peer-reviewed each other’s work. (Spoiler: Stick-figure definitions got rave reviews.)
  2. Teaching Philosophy Tracker: They documented their evolving teaching beliefs on Seesaw—think audio reflections over doodles of their "aha!" moments.

 

The goal? Let them experience Seesaw’s perks (and pitfalls) before they’re responsible for 25 kindergartners with iPads. Figure 1 shows an example of using Seesaw to display students’ work and to interact with the instructor.

 

 

The Good, the Bad, and the Cute (Because Stick Figures)

The Wins:

  • "Finally, collaboration that doesn’t suck!" Students loved peer feedback and the platform’s visual appeal.
  • "I’d use this for snow days!" Many saw Seesaw as a lifeline for hybrid/online learning.
  • "It’s like a digital scrapbook!" The mix of photos, voiceovers, and drawings made assignments feel personal.

 

The Oops Moments:

  • "Will my kindergartners even get this?" Some worried about the complexity for little learners.
  • "Where’s the teacher manual?" Many felt lost navigating Seesaw’s teacher features (since they’d only used it as students).
  • "Tech is great, but what about writing?" A few students struggled to balance screens with pencil-and-paper skills.

 

The Big Takeaway: "We Need More Practice!"

By the end, 68% of participants wanted to use Seesaw in their future classrooms—but with caveats:

  • More training: “Show me how to assign work, not just submit it!”
  • Age-appropriate hacks: “Maybe simplify the interface for first graders?”
  • Balance: “Tech shouldn’t replace crayons… just complement them.”

 

 

Try This in Your Classroom

For professors or K-12 teachers curious about Seesaw:

  1. Start small: Use it for one activity (e.g., vocabulary visuals).
  2. Play both roles: Have teacher candidates try it as students first, then as teachers.
  3. Embrace the mess: Let them critique it. ("Why is the upload button so tiny?!")

 

Final Thought: Tech Won’t Replace Teachers… But It Will Test Their Patience

Seesaw isn’t magic—but as a bridge between theory and practice? Gold. As one participant summed it up: "It’s like seeing my future classroom… minus the glitter explosions."

 

Seesaw boosted engagement and tech skills for future teachers, but they crave more training—and reassurance that it won’t replace finger painting.

 

Reference

Yang-Heim, G. Y. A., & Lin, X. (2024). Teacher candidates’ perspectives on the integration of digital tools in teacher training programs: A case study of using Seesaw. International Journal of Technology-Enhanced Education, 3(1), 1-19. https://doi.org/10.4018/IJTEE.362622

 

Wednesday, June 25, 2025

Talking to a Bot, Learning Like a Pro: How AI Simulated Interviews Empower Adult Learners

 

 By Xi Lin

Adult learners are known for juggling a full plate—careers, family, community responsibilities—and yet still choosing to further their education. Asynchronous online learning often offers the flexibility they need. But what about connection? Engagement? Real-world practice? That’s where AI-driven simulation interviews step in, offering a surprisingly effective bridge between flexible learning and meaningful professional development.

The Problem: Professional Learning Without Professionals

Let’s be real. Adult education courses do a great job of covering theory, but arranging real-life interviews with seasoned professionals? Not so easy. Between time zones, schedules, and professional gatekeeping, getting face time with a mentor or educator in your field can feel like a logistical nightmare. Yet, these interactions are exactly what adult learners crave—practical wisdom, career context, and that “real world” feel.

So, what if learners could simulate those conversations using AI?

Enter the Virtual Interview

In a recent study by Lin et al. (2025), adult learners enrolled in an asynchronous online graduate course were invited to interview artificial intelligence tools—yes, ChatGPT included—by assigning them roles like program developer, counselor, or instructor. Students created prompts, held “conversations” with the AI, and then reflected on what they learned.

Figure 1. Students interviewed ChatGPT

What happened next? The AI might not have passed a Turing test, but it definitely left a lasting impression.
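
For readers who want to tinker, here is a minimal sketch of the role-assignment idea, assuming the OpenAI Python SDK. One hedge up front: the students in the study worked through the ChatGPT interface rather than the API, and the model name and prompt text below are my own placeholders.

```python
# Minimal sketch of an AI "simulation interview" role prompt.
# Assumes `pip install openai` and an OPENAI_API_KEY in the environment;
# the model and prompts are illustrative placeholders.
from openai import OpenAI

client = OpenAI()

# Assign the AI a professional role, as the students' prompts did
role = (
    "You are an experienced adult education program developer. "
    "Answer a graduate student's interview questions with concrete, "
    "practice-based detail."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat-capable model works here
    messages=[
        {"role": "system", "content": role},
        {"role": "user", "content": "Walk me through how you plan a new program."},
    ],
)
print(response.choices[0].message.content)
```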

What Changed?

After 42 students across four course sessions completed their AI simulation interviews and reflected on the experience, Lin and colleagues (2025) identified five key themes that revealed what learners really thought:

1. Quick, Accessible, Insightful

Students loved how fast the AI responded. No scheduling. No small talk. Just immediate, structured feedback. One student noted, “It was like interviewing a human…but faster and more structured.” In a world where adult learners are pressed for time, that’s gold.

2. But... Kind of Robotic

AI’s downside? Repetitive and generic answers. “Some of the responses were repeated,” one student said. Others noticed the classic disclaimer—“I am not a program planner, but…”—popping up a bit too often. Helpful? Yes. Personal? Not quite.

3. Perfect for Preparation

Students described the experience as a “jumpstart” to deeper learning. One participant shared: “What used to take hours of research can now be found in minutes.” Another said the simulation “highlighted the multifaceted nature of mentoring.” The takeaway? AI is a solid first step—just not the final one.

4. AI vs. Human: A Different Kind of Feedback

Some students appreciated AI’s unbiased tone and clarity. Others missed human warmth and nonverbal cues. One participant summed it up: “It was helpful, but a bit like talking to a robot.” Still, many agreed both forms of feedback had value—they’re just... different.

5. Trust Issues

AI’s accuracy was questioned. “How would we know if the information is correct?” one student asked. Others likened it to an interactive Wikipedia—helpful, but not gospel. In short, learners saw the tool’s strengths and its blind spots.

A Hybrid Solution: AI + Humans = Better Together

The verdict? AI isn’t a silver bullet—but it’s a powerful ally. Think of it as a personal research assistant that never sleeps. The study suggests that a hybrid model—combining AI’s speed with human mentors’ depth—could offer the best of both worlds.

The goal isn’t to replace human interaction but to amplify learning by using AI for low-stakes practice, scaffolding complex ideas, and encouraging self-paced exploration.

Implementation Tips for Educators

Inspired to try this out in your course? Here’s how to get started:

  • Design Smart Prompts: Guide students to ask open-ended, context-specific questions tied to course goals (see the example after this list).
  • Encourage Reflection: Use discussion boards to let students share what they learned—and what they questioned.
  • Blend AI with Human Input: Follow up the AI interview with real-world guest speakers, peer feedback, or instructor-guided discussions.
  • Be Transparent: Help students understand AI’s limitations. Use it to spark critical thinking, not just information gathering.
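
For instance (an invented example, not taken from the study), compare a generic prompt like “Tell me about adult education” with a role-anchored one: “You are a counselor in a community college’s adult degree-completion program. What barriers do returning students most often raise, and how do you help them plan around work and family?” The second gives the AI a role, a context, and a question it can answer concretely.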

Final Thoughts: It’s Not Just Talk

AI simulation interviews are more than a novelty—they’re a practical, scalable, and flexible way to deepen understanding, especially for adult learners navigating professional pathways. The magic isn’t in the machine—it’s in how learners engage with it, reflect on it, and ultimately connect the dots between theory and practice.

And that, in the end, might be the real lesson: meaningful learning doesn’t always require a person on the other end of the line. Sometimes, it just takes a good question and a curious mind.

Reference

Lin, X., Zhao, T., Schmidt, S. W., & Zhou, S. (2025). Using AI as a learning tool through simulation interviews to enhance adult learning. Adult Learning. https://doi.org/10.1177/10451595251345274

 

Wednesday, May 28, 2025

Bringing Online Lectures to Life: How Timeline-Anchored Comments Transform Asynchronous Learning


By Xi Lin

Asynchronous online learning is valuable for busy adult learners juggling full-time jobs, family obligations, and coursework. But let’s face it: staring at a screen alone, watching a lecture without any interaction, can make the most fascinating topic feel like watching paint dry. The real challenge? Fostering meaningful interaction in a space designed for flexibility, but not necessarily engagement.

Enter a game-changer from East Asia’s entertainment playbook: the Video Timeline-Anchored Comment (VTC) tool. Known in anime circles as danmaku (see Figure 1), this feature enables viewers to post comments tied to specific moments in a video, creating an experience that feels almost live. In one study, Lin et al. (2024) explored how this tool reshaped the way students interacted in asynchronous adult education courses.

Figure 1. Screenshot of a danmaku-commented episode.


The Problem: Asynchronous Isolation

Traditional asynchronous classes, with their text-based discussion boards and delayed responses, often leave learners feeling like they’re shouting into a void. Adult learners, experienced and reflective but time-strapped, need more than static screens and perfunctory posts. They crave connection, immediacy, and interaction, but the structure of asynchronous education often fails to deliver.

 

A “Live” Experience, Anytime

The VTC tool used in the study allowed students to comment directly on specific moments in instructional videos hosted on Canvas Studio. These comments, which appear like little chat bubbles, made it possible to ask questions, offer insights, and share thoughts right where the learning occurred. See Figure 2, or you can access an example here: https://youtu.be/5w3zh-vbV1A 

Figure 2. Using the VTC in Canvas Studio.


A. Comments made at 6:35 of the video pop up in the right corner of the screen.

B. Clicking the bubble takes viewers to the full remarks in Canvas Studio.

C. Clicking the time (i.e., 6:35) jumps viewers to the point in the video where the comment was made.
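
Behavior C boils down to mapping a comment’s anchor time to a seek position in the video. As a small illustration (my sketch, not Canvas Studio’s internals), the same convention is visible in YouTube’s t= parameter, using the example video linked above:

```python
# Toy sketch: turn an "mm:ss" comment anchor into a timestamped video link.
# (My illustration using YouTube's t= seconds parameter; Canvas Studio
# handles this mapping internally.)
def seek_link(video_url: str, anchor: str) -> str:
    """Convert an 'mm:ss' anchor into a URL that starts playback there."""
    minutes, seconds = (int(part) for part in anchor.split(":"))
    return f"{video_url}?t={minutes * 60 + seconds}"

# The 6:35 comment from Figure 2, applied to the example video above:
print(seek_link("https://youtu.be/5w3zh-vbV1A", "6:35"))
# -> https://youtu.be/5w3zh-vbV1A?t=395
```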

 

Students described the experience as “feeling like we’re all in the same room,” even though they weren’t. One learner summed it up perfectly: “Online asynchronous instruction can be lonely at times. Having the instructor commenting in the video helps you feel more like a participant in a class and not the sole learner.”

 

What Changed?

Lin et al. (2024) observed four major shifts in student behavior and experience:

  • From Passive to Active: Instead of merely consuming content, students actively engaged with it. Questions and comments sparked mid-video reflection and dialogue.
  • From Isolated to Connected: Students no longer felt alone. “It made it seem like we are all sitting in the same room having a conversation,” one said.
  • From Distracted to Focused: The dynamic commenting kept students alert and engaged—no more zoning out during long lectures.
  • From Surface-Level to Deep Learning: By discussing key moments as they occurred, students gained clarity, different perspectives, and a deeper understanding of the content.

 

Instructor as Facilitator, Not a Lecturer

The instructor’s role was key but not dominating. They offered tech support (like how to use Canvas Studio’s VTC function), moderated discussions, and chimed in to deepen content engagement. Most importantly, they created space for students to lead, present, and teach, thus empowering learners as co-creators of knowledge.

 

Implementation Tips for Educators

Consider trying this out in your online course. Lin et al. (2024) offer some practical advice:

  • Choose the Right Tools: Platforms like Canvas Studio support timeline-anchored commenting. Get familiar with its features first.
  • Make It Intentional: Tie comments to learning goals. Ask students to post questions, insights, or clarify misunderstandings at key moments.
  • Train for Success: Not all students are tech-savvy. Offer demos, how-to guides, and examples of quality comments.
  • Encourage Dialogue: Require students to respond to each other’s comments to foster deeper interaction.
  • Reflect and Adjust: Gather feedback and be open to making adjustments based on students’ preferences and technical comfort.

 

Final Thoughts: Small Tool, Big Impact

The VTC tool is more than a flashy feature. It is a bridge between isolated learners and a thriving online learning community. A bridge between content and cognition. And most importantly, a bridge between asynchronous convenience and synchronous-like engagement.

The next time you prepare an online lecture or discussion, consider incorporating a layer of real-time (or pseudo-real-time) interaction. It might just turn your virtual class into a vibrant learning community—one comment bubble at a time.

 

Reference

Lin, X., Sun, Q., & Zhang, X. (2024). Increasing student online interactions: Applying the video timeline-anchored comment (VTC) tool to asynchronous online video discussions. International Journal of Human–Computer Interaction, 40(19), 5910-5922. https://doi.org/10.1080/10447318.2023.2247554