The Use And Misuse Of Machines That Simulate Human Cognition


Presentation Transcript

Slide 1

The Use And Misuse Of Machines That Simulate Human Cognition Hall Beck, PhD Office: 215 Smith-Wright Hall

Slide 2

The Goals of Our Class 1) To learn about the appropriate and inappropriate uses of automation. 2) To speculate on how automation will affect our world and how we might study that world before it exists. 3) To provide you with an example of the scientific quest for knowledge. 4) To discover something about yourself and how you interact with your world.

Slide 3

What Is Automation? "Any sensing, detection, information processing, decision making, or control action that could be performed by humans but is actually performed by a machine" (Moray, Inagaki, & Itoh, 2000). Automation is usually viewed as a continuum, ranging from manual control to full automation.

Slide 4

Some Quotes About Technology But lo! Men have become the tools of their tools. - Henry David Thoreau

Slide 5

Some Quotes About Technology It has become appallingly obvious that our technology has exceeded our humanity. - Albert Einstein

Slide 6

Some Quotes About Technology We live in an era when automation is ushering in a second industrial revolution. - Adlai E. Stevenson

Slide 7

Some Quotes About Technology The first rule of any technology used in a business is that automation applied to an efficient operation will magnify the efficiency. The second is that automation applied to an inefficient operation will magnify the inefficiency. - Bill Gates

Slide 8

Four Generations of Artificial Environments (AEs) Where we have been, where we are, and where we are going

Slide 9

First Generation Unidirectional Communication - Information moves from the machine to the person but not from the person to the machine.

Slide 10

Second Generation Bidirectional Communication - Information moves from the machine to the person and from the person to the machine.

Slide 11

Third Generation Virtual Reality - Information moves from the machine to the person and from the person to the machine. Ideally, the synthetic environment is indistinguishable from the actual environment.

Slide 12

Fourth Generation Life Simulation - The synthetic and actual environments are indistinguishable, and the person does not know whether they are in an actual or synthetic world.

Slide 13

Automation Usage Decisions (AUDs) AUDs: Choices in which a human operator has the option of using manual control or one or more levels of automation (LOAs) to perform a task.

Slide 14

Some AUDs Are Commonplace Checkbooks may be balanced with a calculator or by mental arithmetic. Automobiles can be set to cruise control or the driver may operate the accelerator pedal. Stock purchases may be based on the output of software programs or investors may rely on their subjective assessment of the market.

Slide 15

Some AUDs Have Historic Consequences: Casey Jones, Pearl Harbor, Three Mile Island

Slide 16

Some AUDs Have Historic Consequences: USS Greeneville, 2000 Election

Slide 17

Types of Automation Static: Level of automation is set at the design stage. Adaptive: Level of automation varies depending on the situation.

Slide 18

Optimal And Suboptimal AUDs If it is assumed that the objective is to perform a task, the optimal AUD is to use the level of control, manual through full automation, that maximizes the likelihood of a successful outcome. A suboptimal AUD is a choice to use a level of control that does not maximize the likelihood of successfully performing the task.

Slide 19

Types of Suboptimal AUDs Misuse is overreliance, using automation when manual control or a relatively low LOA has a greater likelihood of success. Disuse is the underutilization of automation, manually performing a task that would best be performed by a machine or a higher LOA.
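To make the last two slides concrete, here is a minimal sketch in Python (an illustration added to this transcript, not material from the original study). It assumes we can estimate a probability of task success for each level of control, picks the level that maximizes that probability, and labels any other choice as misuse or disuse. The numeric LOA codes and the probabilities are invented for the example.

# Minimal sketch: classifying an automation usage decision (AUD) given
# estimated success probabilities for each level of control.
# LOA 0 stands for manual control; higher numbers are higher levels of automation.

def optimal_loa(success_prob_by_loa):
    """Return the level of control that maximizes the likelihood of success."""
    return max(success_prob_by_loa, key=success_prob_by_loa.get)

def classify_aud(chosen_loa, success_prob_by_loa):
    """Label a choice as optimal, misuse (overreliance), or disuse (underutilization)."""
    best = optimal_loa(success_prob_by_loa)
    if chosen_loa == best:
        return "optimal"
    # Relying on more automation than the best option warrants is misuse;
    # relying on less is disuse.
    return "misuse" if chosen_loa > best else "disuse"

# Invented example: manual control succeeds 70% of the time, a decision aid 85%.
probs = {0: 0.70, 1: 0.85}
print(classify_aud(0, probs))  # "disuse": the operator stayed with manual control
print(classify_aud(1, probs))  # "optimal"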

Slide 20

Errors Resulting in Misuse and/or Disuse Recognition Errors - Operator fails to recognize that an alternative, either automated or manual, is available. Appraisal Errors - Operator incorrectly estimates the utilities of the alternatives. Intent Errors (also called action errors) - Operator deliberately selects the alternative that does not maximize the likelihood of task success.

Slide 21

Two Images of an Operator An operator is a determined individual whose sole objective is to maximize task performance. An operator's decision to rely on automation depends on numerous contingencies, only one of which is achieving a successful performance.

Slide 22

Intent Errors and Decision Aids: Doing It Your Way When Your Way Is Obviously Wrong

Slide 23

Decision Aids And Intent Errors Probably no area of automation has proven more problematic than the introduction of decision aids. Beck, Dzindolet, and Pierce contended that much of the disuse of decision aids is due to intent errors. That is, operators refuse to take "advice" from a decision aid that they know would improve their performance.

Slide 24

200 "Preparing" Trials Participants saw a progression of slides on the PC screen, half of which contained a trooper in cover. Machine Absent: Pressed a "catch" to show if the warrior was available or missing Machine Present: 1) Pressed a "catch" to demonstrate if the fighter was available or truant and 2) Received the choice guide's reaction

Slide 25

100 "Test" Trials Participants saw a progression of slides on the PC screen, half of which contained a trooper in cover. Machine Absent: Pressed a "catch" to demonstrate if the warrior was available or truant Machine Present: 1) Received the choice guide's "proposal" and 2) Pressed a "catch" to show if the fighter was available or truant

Slide 26

Results

Slide 27

Operators In Machine Present Condition

Slide 28

Machine Present Condition: Estimated Accuracies

Slide 29

Rely On A Decision Aid: I Would Rather Lower My Grade Hall Beck, PhD, Appalachian State University; Mary T. Dzindolet, Cameron University; Linda G. Pierce, Aberdeen Proving Ground

Slide 30

Objectives of Study 1 To determine the extent to which people would deviate from an optimal (rational) decision-making strategy. To discover whether automation misuse or automation disuse is the greater problem on this task.

Slide 31

Design 3 (Relative Performance: Inferior, Equal, Superior) x 2 (Feedback: Yes, No) between-subjects. Choice to base credit on self or computer as the dependent variable.

Slide 32

Trial by Trial Procedure Present photograph for .75 seconds. Participant responds via mouse, indicating whether the target was in the photograph. Contrast detector then scans the photograph for a human form. It attempts to determine whether the target is in the photograph.
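Sketched as code, a single Study 1 trial might look like the following (an illustrative reconstruction; the display routine, the mouse response, and the contrast detector are stand-ins, and the detector's hit rate is invented). The error counts it accumulates are what feed the feedback described on the next slide.

import random
import time

def contrast_detector(photo_has_target, hit_rate=0.8):   # hit_rate is an assumption
    """Stand-in for the detector that scans the photograph for a human form."""
    return photo_has_target if random.random() < hit_rate else not photo_has_target

def study1_trial(photo_has_target):
    """Present the photograph, collect the participant's response, then the detector's."""
    time.sleep(0.75)                                      # photograph shown for .75 seconds
    participant_says = random.choice([True, False])       # stand-in for the mouse response
    detector_says = contrast_detector(photo_has_target)
    return participant_says != photo_has_target, detector_says != photo_has_target

participant_errors = detector_errors = 0
for trial in range(200):                                  # 200 trials, half containing the target
    p_err, d_err = study1_trial(photo_has_target=(trial % 2 == 0))
    participant_errors += p_err
    detector_errors += d_err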

Slide 33

Still More Procedure After 200 trials, participants in the Feedback condition are told how many errors they and the machine made. Participants in the No Feedback condition do not receive this information. Participants are either inferior, equal, or superior to the contrast detector.

Slide 34

High Noon All participants are told that 10 more trials will be conducted, which will determine the extra credit they receive. At this point, the participant must base extra credit on self or machine (the AUD).

Slide 35

Persons Basing Extra Credit on Self

Slide 36

Optimal Use of Automation

Slide 37

Objectives of Study 2 To determine whether feedback reduces the bias against using automation. To discover what type of feedback is most effective in reducing this bias.

Slide 38

Forms of Feedback Trial by Trial: After every trial, the participant is told whether the target was in the photograph. Cumulative: After 200 trials, the participant is told the total errors made by self and by the contrast detector. Prior Results: After 200 trials, participants are informed that people who base extra credit on the detector usually receive more points.

Slide 39

Procedure and Design Same basic task as the first two studies. Detector superior to all participants. 2 (Trial by Trial: Yes, No) x 2 (Cumulative: Yes, No) x 2 (Prior Results: Yes, No) between-subjects. Ten trials randomly selected. Choice to base credit on self or contrast detector.
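The 2 x 2 x 2 crossing of the three feedback forms can be enumerated directly; the sketch below is an illustration rather than the authors' code, and the message wording, error counts, and random assignment are assumptions.

import itertools
import random

# Eight between-subjects cells: each feedback form is either given or withheld.
FACTORS = ["trial_by_trial", "cumulative", "prior_results"]
CELLS = [dict(zip(FACTORS, levels))
         for levels in itertools.product([True, False], repeat=3)]

def feedback_messages(cell, own_errors, detector_errors):
    """Return the feedback a participant assigned to this cell would receive."""
    messages = []
    if cell["trial_by_trial"]:
        messages.append("After each trial: told whether the target was in the photograph.")
    if cell["cumulative"]:
        messages.append(f"After 200 trials: you made {own_errors} errors; "
                        f"the contrast detector made {detector_errors}.")
    if cell["prior_results"]:
        messages.append("People who base extra credit on the detector usually earn more points.")
    return messages

cell = random.choice(CELLS)                      # random assignment to a cell (assumed)
print(feedback_messages(cell, own_errors=34, detector_errors=12))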

Slide 40

Optimal Use of Automation
