
2011-11-08 (Tue.)

L5 for English Acquisition I Bk and II Bi, 2011
These slides are available at the following URL:
http://clsl.hi.h.kyoto-u.ac.jp/~kkuroda/lectures/11B-KIT/KIT-2011B-L05-slides.pdf

黒田 航 (part-time lecturer)

Tuesday, November 8, 2011

Announcements

✤ Class cancellation

✤ No class on Tuesday, January 10, 2012

✤ I will be attending the Global WordNet Association meeting, held in Matsue from January 9 to 13, 2012

✤ The final session is January 31 = bonus exam (equivalent to L13)

✤ Absence policy

✤ Up to three absences are allowed; four or more mean automatic failure (apparently)

✤ That said, if your grades are good enough, I will not make an issue of attendance

Course Materials

✤ The listening materials are available from the following web page
✤ http://clsl.hi.h.kyoto-u.ac.jp/~kkuroda/lectures/KIT-11B.html

✤ Please use them to prepare and review outside class hours

✤ I cannot provide exactly the same thing for the speed-reading materials, but I will work something out

Today's Plan

✤ First 60 minutes (including a 5-minute break)

✤ Report on the L4 results

✤ Explanation of the L4 answers

✤ Last 30 minutes

✤ Listening practice using TED

✤ Watch Matt Cutts: Try Something New for 30 Days (3 min) all the way through

✤ Listening to the talk as a whole

L4 Results

Grading Method

✤ Points

✤ Full credit 1.0 (shown as ◯) and partial credit 0.5 (shown as △)

✤ Evaluation criteria

✤ Raw score S = (number of ◯) + (number of △)/2

✤ Accuracy P = (number of ◯)/S

✤ Score used for grading: S* = 100 × S/(total number of questions) (e.g., 30)

✤ There may be grading errors

✤ I sometimes miscount or make addition mistakes, so if your score is affected, please let me know
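The scoring rules above amount to a few lines of arithmetic; here is a minimal sketch (the function name and the example counts are our own illustration, not part of the course materials):

```python
def l4_scores(full: int, partial: int, total: int) -> dict:
    """Grading quantities from the slide: full answers count 1.0 (marked ◯),
    partial answers count 0.5 (marked △), out of `total` questions."""
    s = full + partial / 2            # raw score S = ◯ + △/2
    p = full / s if s else 0.0        # accuracy  P = ◯ / S
    s_star = 100 * s / total          # grading score S* = 100 × S / total
    return {"S": s, "P": p, "S*": s_star}

# e.g. 24 full and 3 partial answers out of 30 questions:
print(l4_scores(24, 3, 30))  # S = 25.5, S* = 85.0
```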

Feedback on the Questions

        Q1: number of questions        Q2: difficulty of questions
        Av.   Stdev  Max  Min          Av.   Stdev  Max  Min
1Bk
2Bi     2.43  0.53   3    2            4.57  0.79   5    3

Request: please write your survey answers in the table

L4 Score Distribution: 1Bk

✤ Number of students: 30

✤ Mean: 24.94/n [83.12] points

✤ Standard deviation: 5.45/n [18.17] points

✤ Maximum: 29.00/n [96.67] points

✤ Minimum: 0.00/n [0.50] points

✤ n = 30

✤ Number of score groups = 2?

L4 Score Distribution: 2Bi

✤ Number of students: 11

✤ Mean: 27.41/n [94.51] points

✤ Standard deviation: 3.43/n [11.82] points

✤ Maximum: 29.00/n [100.00] points

✤ Minimum: 20.00/n [68.97] points

✤ n = 29

✤ Number of score groups = 2
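The bracketed figures on these distribution slides are the raw scores rescaled to a 0-100 range. As a sketch (the helper name is our own), the conversion is just:

```python
def normalize(raw: float, n: int) -> float:
    """Rescale a raw score out of n questions to the bracketed
    percentage shown on the slides: 100 * raw / n, to 2 places."""
    return round(100 * raw / n, 2)

# 2Bi slide, where n = 29:
print(normalize(29.00, 29))  # -> 100.0  (maximum)
print(normalize(20.00, 29))  # -> 68.97  (minimum)
```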

L4 Answers

Common Errors

✤ 5. X involved Y
✤ Y accompanies X; X requires Y

✤ 6. X interact with Y
✤ X exchanges with Y; X and Y affect each other

✤ 11. X love Y
✤ X really likes Y

✤ 14. X is/are persuasive = X persuade(s) you successfully
✤ X is persuasive

✤ 15. X explored Y
✤ X explores the possibility of Y; X pursues the possibility of Y

✤ 25. what if S V
✤ What if S did V?

01/20

✤ It turns out it’s really hard to learn this or understand this from watching people because when we interact we do all of these cues automatically. We can’t carefully [1. control] them because they’re subconscious for us. But with the robot you can. And so in this video here— this is a video taken from David DeSteno’s lab at Northeastern University. He’s a psychologist we’ve been collaborating with. There’s actually a scientist carefully controlling Nexi’s cues to be able to study this question.

✤ And the [2. bottom] line is— the reason why this works is— because it turns out people just behave like people even when interacting with a robot.

02/20

✤ So given that key insight, we can now start to imagine new kinds of applications for robots. For instance, if robots do respond to our non-verbal cues, maybe [3. they] would be a cool, new communication technology. So imagine this: What about a robot accessory for your cellphone? You call your friend, she puts her handset in a robot, and, bam!, you’re a MeBot— you can [4. make] eye contact, you can talk with your friends, you can move around, you can gesture— maybe the next best thing to really being there, or is it?

03/20

✤ To explore this question, my student, Siggy Adalgeirsson, did a study where we brought human participants, people, into our lab to do a collaborative task with a remote collaborator. The task [5. involved] things like looking at a set of objects on the table, discussing them in terms of their importance and relevance to performing a certain task— this ended up being a survival task— and then rating them in terms of how valuable and important they thought they were. The remote collaborator was an experimenter from our group where they used one of three different technologies to [6. interact] with the participants.

04/20

✤ So the first was just the screen. So this is just like video conferencing today. The next was to add mobility, so have the screen on a mobile base. This is sort of like, if you’re familiar with any of the telepresence robots today— this is sort of mirroring that situation. And then the fully expressive MeBot.

✤ So after the interaction, we [7. asked] people to rate their quality of interaction with— with the technology, with a remote collaborator, through this technology in a number of different ways. We looked at psychological involvement— how much empathy did you feel for the other person?

05/20

✤ We looked at overall engagement. We [8. looked] at their desire to cooperate. And this is what we see when they use just the screen. It turns out when you [9. add] ah mobility— the ability to roll around the table— you get a little more of a boost. And you get even more of a boost when you add the full expression. So it seems like this physical social embodiment actually really makes a [10. difference].

06/20

✤ Now let’s try to put this into a little bit of context. Today we know that families are living farther and farther apart, and that definitely takes a toll on family relationships and family bonds over distance. For me, I have three young boys, and I want them to have a really good relationship with my— with their grandparents, but my parents live thousands of miles away, so they just don’t get to see each other that often. We try Skype, we try phone calls, but my boys are little— they don’t really wanna talk, they wanna play.

07/20

✤ So, I [11. love] the idea of thinking about robots as a new kind of distance play technology. So I imagine a time not too far from now— my mom can go to her computer, open up a browser and jack into a little robot. And as kind of grandma-bot, she can now play, really play, with my grand— my sons, with her grandsons, in the real world with his real toys.

08/20

✤ I could [12. imagine] grandmothers being able to do social plays with their granddaughters, with their friends, and to be able to share all kinds of other activities around the house, like sharing a bedtime story. And through this sort of technology, being able to be an active participant in their grandchildren’s lives in a way that’s not possible today.

✤ Let’s think about some other domains, like maybe [13. health]. So in the United States today, over 65 percent of people are either overweight or obese, and now it’s a big problem with our children as well.

09/20

✤ And we know that as you get older in life, if you’re obese when you’re younger, that can lead to chronic diseases that not only reduce our quality of life, but are a tremendous economic burden on our health care system.

✤ But if robots can be engaging, if we like to cooperate with robots, if robots are [14. persuasive], maybe a robot can help you maintain a diet and exercise program, maybe they can help you manage your weight.

10/20

✤ So sort of like a digital Jiminy— as the well-known fairy tale— a kind of friendly supportive presence that’s always there to be able to help you make the right decision in the right way, at the right time, to help you form healthy habits. So we actually [15. explored] this idea in our lab.

✤ This is a robot, Autom. Cory Kidd developed this robot for his doctoral work. And it was [16. designed] to be a robot diet and exercise coach. It had a couple of simple non-verbal skills it could do. It could make eye contact with you. It could share information looking down at a screen.

11/20

✤ You’d use a screen interface to enter information, like how many calories you ate that day, how much exercise you [17. got]. And then it could help track that for you. And the robot spoke with a synthetic voice to engage you in a coaching dialogue modeled after trainers and ah patients and so forth.

✤ And it would build a working alli— working alliance with you through that dialogue. It could help you set goals and track your progress, and it would help motivate you.

12/20

✤ So an interesting [18. question] is, does the social embodiment really matter? Does it matter that it’s a robot? Is it really just the quality of advice and information that matters?

✤ So it’s for that question, we did a study in the Boston area where we have put one of three interventions in people’s homes for a period of several weeks. One case was the robot you [19. saw] there, Autom. Another was a computer that ran the same touch-screen interface, ran exactly the same dialogues. The quality of advice was identical.

13/20

✤ And the third was just a pen and paper log, because that’s the standard intervention you typically get when you start a diet and exercise program.

✤ So, one of the things we really wanted to look at was not how much weight people lost, but really how long they interacted with the robot. Because the [20. challenge] is not losing weight, it’s actually keeping it off. And the longer you could interact with one of these interventions, well that’s indicative, potentially, of longer term success.

14/20

✤ So the first thing I want to look at is how long, how long did people interact with these systems. It turns out that people interacted with the robot significantly more, even though the quality of the [21. advice] was identical to the computer. When it asked people to rate it on terms of the quality of the working alliance, people rated the robot higher and they trusted the robot more. (Laughter)

✤ And when you look at emotional engagement, it was completely different. People would name the robots. They would [22. dress] the robots. (Laughter)

15/20

✤ And even when we would come up to pick up the robots at the end of the study, they would come out to the car and say good-bye to the robots. They didn’t do this with a computer.

✤ The last thing I want to talk about today is the future of [23. children’s] media. We know that kids spend a lot of time behind screens today, whether it’s television or computer games or whatnot. My sons are— they love the screen. They love the screen. But I want them to play; as a mom I want them to play like real world play.

16/20

✤ And so I have a new project in my group I wanted to present to you today called Playtime Computing that’s really trying to think about how can we take— [24. what’s] so engaging about digital media and literally bring it off the screen, into the real world of the child, where it can take on many of the properties, many of the properties of the real world play.

✤ So here’s the uh kind of first exploration of this idea, where characters can be physical or virtual, and where the digital content can literally come off the screen, into the world and back. I like to think of this as the uh Atari Pong of, of this blended reality play.

17/20

✤ But we can push this idea further. [25. What if]—
✤ Nathan: Here we go! Yay!
✤ CB: —

✤ the character itself can come into your world? It turns out that kids love it when the character becomes real and enters into their world. And when it’s in their world, they can relate to it and play with it in a way that’s fundamentally different from how they play with it on the screen.

✤ Another important idea is this notion of persistence of character [26. across] realities.

18/20

✤ So changes that children make in the real world need to translate to the virtual world. So here, Nathan has changed the letter A to the number 2. You can imagine maybe these symbols give the characters special powers when it goes into the virtual world. So they are now sending the character back into that world. And now it’s got number power.

✤ And then finally, what I’ve been trying to do here is create a really immersive experience for kids, where they really feel like they are [27. part] of that story, a part of that experience.

19/20

✤ And I really wanna a new— spark their imaginations the way mine was sparked as a little girl watching Star Wars. But I want to do more than that. I actually want them to create those experiences. I want them to be able to literally build their imagination into these experiences and make them their own. So we’ve been exploring with a lot of ideas in telepresence and [28. mixed] reality to literally allow kids to project their ideas into this space where other kids can interact with them and build upon them. I really want to come up with new ways of children’s media that foster creativity and learning and innovation. I think that’s very, very important.

20/20

✤ So this is a new [29. project]. We’ve invited a lot of kids into this space, and they think it’s pretty cool. But I can tell you, the thing that they love the most is the robot. What they care about is the robot. Robots touch something deeply human within us. And so whether they’re helping us to become creative and innovative, or whether they’re helping us to feel uh more deeply connected despite distance, or whether they are our trusted sidekick who’s helping us attain our personal goals in becoming our highest and best selves, for me, [30. robots] are all about people.

✤ Thank you. (Applause)

Listening Practice: L5

Matt Cutts: Try Something New for 30 Days

✤ A TED talk

✤ About 3 minutes long

✤ Speaker

✤ Matt Cutts, an engineer at Google in the US

✤ A young male speaker of American English; his pronunciation is very clear, and he does not speak especially fast

✤ Theme

✤ How to pick up a good new habit / break a bad one

✤ You can do it by doing it every day for 30 days
