IETTI 221 – C Computer Programming (EXPERIMENTS)
NAME:

Last update: 26 December 2020
#   Due by            Experiment                                                      Score
01  Session #12       Demonstrate properties of series networks
02  Session #12       Demonstrate properties of parallel networks
03  Session #12       Demonstrate KCL in a series-parallel resistor circuit
04  Session #12       Demonstrate KVL in a series-parallel resistor circuit
05  Session #12       Demonstrate voltage divider loading
06  Session #12       SPICE simulation of a series-parallel resistor circuit
07  Session #12       C/C++ simulation of a series-parallel resistor circuit
08  Session #24       Choose your own
09  Session #24       C/C++ simulation of diode characteristic function
10  Session #24       Demonstrate resistor-capacitor time delay
11  Session #24       SPICE simulation of a resistor-capacitor time delay
12  Session #24       C/C++ simulation of a resistor-capacitor time delay
13  Session #24       C/C++ simulation of a resistor-inductor time delay
14  Session #24       Demonstrate astable 555 circuit
15  Session #36       Choose your own
16  Session #36       Demonstrate relay-based logic circuit function
17  Session #36       Demonstrate IC logic gate function
18  Session #36       C/C++ simulation of an elementary logic function
19  Session #36       C/C++ simulation of a combinational logic function
20  Session #36       C/C++ simulation of a latching logic function
21  Session #36       Demonstrate IC logic implementing arbitrary truth table
22  Session #48       Choose your own
23  Session #48       C/C++ simulation of AC sinusoidal source (time-domain)
24  Session #48       Demonstrate passive filter circuit
25  Session #48       C/C++ simulation of a passive filter circuit (frequency-domain)
26  Session #48       SPICE simulation of a polyphase circuit (phasor-domain)
27  Session #48       C/C++ simulation of a polyphase circuit (phasor-domain)
28  Session #48       C/C++ simulation of harmonic series (time-domain)
29  Last day          C/C++ simulation of arbitrary waveform
30  Last day          Demonstrate single-transistor BJT audio amplifier
31  Last day          SPICE simulation of BJT amplifier
32  Last day          Demonstrate multi-transistor audio amplifier
33  Last day          Demonstrate precise voltage gain using an opamp
34  Last day          Demonstrate opamp with discrete power stage
35  Last day          Demonstrate sinusoidal oscillator circuit and output spectrum

#   Due by            Troubleshooting activity                                        Score
36  Session #12       Computer simulation: unloaded voltage divider
37  Session #24       Computer simulation: loaded voltage divider
38  Session #36       Real circuit: combinational logic
39  Session #48       Real circuit: passive filter
40  Last day          Real circuit: multi-stage audio amplifier

41  Last all-lab day  Lab clean-up activities
This is a lab course focused on scientific experimentation, where students both devise and conduct their own experiments to explore principles. The instructor certifies each experimental stage for proper format, documentation, and accuracy by a checklist. The pre-run stage consists of the student writing a hypothesis (i.e. what they think will happen), an experimental procedure, and an assessment of risks along with appropriate mitigations for those risks. When all pre-run objectives are met, the student then runs the experiment and collects data. That recorded data is shown to the instructor, along with the student’s written analysis of the data and summary of the experiment. The instructor challenges the student with a question related to their experiment, which the student should be able to easily answer. Videorecording of all experiments is encouraged. Experiments are listed in suggested order.
The goal of all scientific experimentation is learning. As such, there is really no such thing as a “bad” hypothesis – even disproven hypotheses offer valuable lessons. What matters most of all is the student’s analysis and summary, where they draw important lessons from the experiment.
Experiment scores reflect the thoroughness and accuracy of your presented work. Work that is complete and accurate when presented to the instructor will receive a 100% score. The instructor’s role is to certify your completed work, with re-work resulting in deductions to your score.
EET Program Learning Outcomes
(1) COMMUNICATION and TEAMWORK – Accurately communicate ideas across a variety of media (oral, written, graphical) to both technical and non-technical audiences; Function effectively as a member of a technical team.

(2) SELF-MANAGEMENT – Arrive on time and prepared; Work diligently until the job is done; Budget resources appropriately to achieve objectives.

(3) SAFE WORK HABITS – Comply with relevant national, state, local, and college safety regulations when designing, prototyping, building, and testing systems.

(4) ANALYSIS and DIAGNOSIS – Select and apply appropriate principles and techniques for both qualitative and quantitative circuit analysis; Devise and execute appropriate tests to evaluate electronic system performance; Identify root causes of electronic system malfunctions.

(5) PROBLEM-SOLVING – Devise and implement solutions for technical problems appropriate to the discipline.

(6) DOCUMENTATION – Interpret and create technical documents (e.g. electronic schematic diagrams, block diagrams, graphs, reports) relevant to the discipline.

(7) INDEPENDENT LEARNING – Select and research information sources to learn new principles, technologies, and/or techniques.
file eet_outcomes
Values
This educational program exists for one purpose: to empower you with a comprehensive set of knowledge, skills, and habits to unlock opportunities in your chosen profession. The following values articulate personal attitudes guaranteed to fulfill this purpose, and the principles upon which this program has been designed.

Ownership – you are the sole proprietor of your education, of your career, and to a great extent your quality of life. No one can force you to learn, make you have a great career, or grant you a fulfilling life – these accomplishments are possible only when you accept responsibility for them.

Responsibility – ensuring the desired outcome, not just attempting to achieve the outcome. Responsibility is how we secure rights and privileges.

Initiative – independently recognizing needs and taking responsibility to meet them.

Integrity – living in a consistently principled manner, communicating clearly and honestly, applying your best effort, and never trying to advance at the expense of others. Integrity is the key to trust, and trust is the glue that binds all relationships: personal, professional, and societal.

Perspective – prioritizing your attention and actions on the things we will all care about for years to come. Recognizing that objective facts exist independent of, and sometimes in spite of, our subjective desires.

Humility – no one is perfect, and there is always something new to learn. Making mistakes is a symptom of life, and for this reason we need to be gracious to ourselves and to others.

Safety – assessing hazards and avoiding unnecessary risk to yourself and to others.

Competence – applying knowledge and skill to the effective solution of practical problems. Competence includes the ability to verify the appropriateness of your solutions and the ability to communicate so that others understand how and why your solutions work.

Diligence – exercising self-discipline and persistence in learning, accepting the fact that there is no easy way to absorb complex knowledge, master new skills, or overcome limiting habits. Diligence in work means the job is not done until it is done correctly: all objectives achieved, all documentation complete, and all root causes of problems identified and corrected.

Community – your actions impact other people’s lives, for good or for ill. Conduct yourself not just for your own interests, but also for the best interests of those whose lives you impact.

Respect – the acknowledgement of others’ intrinsic capabilities, responsibilities, and worth. Everyone has something valuable to contribute, and everyone deserves to fully own their lives.
file eet_values
Course description
This course teaches the application of C-language programming to the experimental simulation of DC resistor circuits, time-delay networks, AC filter and polyphase circuits, digital logic functions, and analog amplifiers. All experiments employ the scientific method: proposing falsifiable hypotheses, devising procedures, gathering data, analyzing results, and developing documentation. Students also apply fundamental circuit principles to the diagnosis of simulated and real faults in these same types of circuits. Mastery standards applied to all experimental and diagnostic steps guarantee attainment of learning outcomes.
Course learning outcomes
• Edit and write C/C++ procedural code to simulate DC resistor circuits, resistor-capacitor and resistor-inductor delay networks, passive filters, polyphase AC power circuits, digital logic functions, and amplifier circuits. (Addresses Program Learning Outcomes 2, 4, 6, 7)

• Troubleshoot faulted DC resistor circuits, transformer circuits, combinational logic networks, and multi-stage audio amplifier circuits from measurements taken at test points with circuit components and connections hidden from view. (Addresses Program Learning Outcomes 4, 6)

• Articulate diagnostic reasoning while troubleshooting these same circuits. (Addresses Program Learning Outcomes 1, 3)
Required Tools, Supplies, and Software
Listed by IETTI course number and course type (Thy = theory, Exp = Experiments, Prj = Projects).
Semester 1 = IETTI-101 (Theory), 103 (Experiments), and 102 (Projects)
Semester 2 = IETTI-104 (Theory), 112 (Experiments), and 105 (Projects)
Semester 3 = IETTI-222 (Theory), 221 (Experiments), and 220 (Projects)
Semester 4 = IETTI-223 (Theory), 225 (Experiments), and 106 (Projects)
Tool, Supply, or Software                                   Thy Exp Prj  Thy Exp Prj  Thy Exp Prj  Thy Exp Prj
                                                            101 103 102  104 112 105  222 221 220  223 225 106

$25 scientific calculator                                   X X X X X X X X X X X X
    Complex number math functions                           X X
$300 personal computer (any OS, not a tablet)               X X X X X X X X X X X X
$10 USB “flash” drive                                       X X X X X X X X X X X X
$50-$100 digital multimeter                                 X X X X X X X X
$400 optional upgrade: Fluke 87-V                           + + + + + +
$300 optional upgrade: Simpson 260                          + + + + + +
$150 USB-based oscilloscope (e.g. Picoscope model 2204A)    X X X X X X X X
$10 solderless breadboard                                   X X X X X X X X
$25 grounding wrist strap                                   X X X X X X X X
$10 jeweler’s screwdriver set                               X X X X X X X X
$10 wire strippers, 18-24 AWG                               X X X X X X X X
$10 needle-nose pliers                                      X X X X X X X X
$20 diagonal wire cutters                                   X X X X X X X X
$10 alligator-clip jumper wires (package of at least ten)   X X X X X X X X
$15 small flashlight                                        X X X X X X X X
$10 safety glasses                                          X X X X X
$25-$100 soldering iron (pencil-tip), 30 Watts or less      X X X X X
$15 tube/spool of rosin-core solder                         X X X X X
$0 software: schematic editor                               X X X X X X X X
$0 software: Notepad++ text editor                          X X X X
$0 software: NGSPICE circuit sim.                           X X X X
$0 software: WSL (Windows Subsystem for Linux)              X X X X
$0 software: tshoot fault sim.                              X X X X
$15 microcontroller development kit and IDE software        X X X
$0 software: PCB layout editor                              X
Scientific calculator – at minimum your calculator must perform trigonometric functions (sine, cosine, tangent, etc.), offer multiple memory registers, and display values in both scientific and “engineering” notations. I recommend the Texas Instruments model TI-36X Pro because it easily performs the complex-number arithmetic necessary for AC circuit analysis and is inexpensive.
Personal computer – all course materials are available in electronic format and are free (most are also open-source), making a portable computer extremely useful. The school provides personal computers for on-campus use, but having your own will enable you to work outside of school. Any operating system, any size hard drive, any amount of RAM memory, and any screen size is appropriate. Useful features worth higher cost include an RJ-45 Ethernet port and an EIA/TIA-232 (9-pin) serial port.
Multimeter – this is your first and most important electronic test instrument. At minimum it must measure DC and AC voltage, DC and AC current (milliAmpere range), resistance, and “diode check” voltage drop. Useful features worth higher cost include microAmpere current measurement, true-RMS AC measurement (for second-semester courses and above), frequency measurement, capacitance measurement, and minimum/maximum value capture. Cost is a strong function of accuracy, frequency range, and safety (“Category” ratings for over-voltage exposure). The Fluke model 87-V is an excellent professional-grade choice for digital multimeters, and the Simpson 260 is an excellent professional-grade choice for analog multimeters. Note that Fluke offers a 25% educational discount for students.
Oscilloscope – once too expensive for student purchase, entry-level USB-based oscilloscopes now cost less than a textbook. Pico Technology is an excellent brand, and their model 2204A comes with high-quality probes as well. Plugged into your personal computer using a USB cable, the Picoscope turns your computer’s monitor into a high-resolution oscilloscope display. Features include two measurement channels, 10 MHz bandwidth, a built-in arbitrary waveform generator (AWG), ±100 Volt over-voltage protection, digital “cursors” for precise interpretation of amplitude and frequency, meter-style measurement capability, a Fast Fourier Transform algorithm for frequency-domain measurement, export ability to several graphic image formats as well as comma-separated values (.csv) files, and serial communications signal decoding. Together with your multimeter, solderless breadboard, and Development Board (which you will construct in the IETTI-102 Project course and is yours to keep), this forms a complete electronics laboratory for doing experiments and projects outside of school.
Soldering – the equipment you purchase for soldering need not be expensive, if you purchase the right solder. For electronics work you must use rosin-core solder. Kester is an excellent brand, and you should avoid cheap imported solders. For lead-based solder, a 63% tin and 37% lead alloy (Sn63/Pb37) works very well. A one-pound roll is likely more solder than you will need in these courses, so I recommend buying just a small tube or small roll. I recommend a fine-tipped soldering iron (15 Watts continuous power, although some with adjustable temperature controls may have higher power ratings to get up to soldering temperature more quickly) and a solder diameter of 0.031 inches or smaller for doing fine printed-circuit board work. Also, keep the tip of your soldering iron clean by wiping it against a damp sponge or paper towel when hot, and do not leave it hot any longer than necessary.
Microcontroller – these courses are not brand- or model-specific, but the Texas Instruments MSP430 series is highly recommended for its powerful features, modern design, and programmability in multiple languages (assembly, C, C++, and Sketch). I particularly recommend the model MSP-EXP430G2ET “LaunchPad” development board (MSP430G2553IN20 microcontroller chip) with Code Composer Studio for the IDE software.
All software required for these courses is free, and some of it is open-source.
Schematic editor – this is used to draft schematic diagrams for circuits. A good one is TinyCAD, but there are also web-based CAD tools such as circuitlab.com that are very effective and easy to use.
Text editor – this is used to create plain-text files, kind of like a word processor but lacking formatting features such as typeface, font size, etc. It is absolutely necessary for writing code of any kind. Notepad++ is a very good editor, but others work well too.
NGSPICE – this is a modern adaptation of the venerable SPICE circuit simulator, which uses a text-coded “netlist” rather than a visual schematic diagram to describe circuits. Very powerful, and with decades of netlist examples from earlier versions of SPICE to use as references. The installer lacks sophistication, being nothing more than a compressed (zip) file that you unpack. Once installed, you should instruct your computer’s operating system to automatically associate any files ending in the extension .cir with the NGSPICE executable file ngspice.exe so that all of your netlist files will appear with the NGSPICE icon and will automatically load into NGSPICE when double-clicked.
WSL – Windows Subsystem for Linux is a “virtual machine” Linux operating system that runs within the Windows operating system, giving you a command-line user environment mimicking that of a Unix operating system. It is a free application from Microsoft, with instructions available from Microsoft on how to install it. I recommend installing the “Debian” distribution of WSL. Once installed, you will use the sudo apt-get install command to download all the necessary C/C++ development tools you will need to compile the tshoot program (see below).
tshoot – this is a specialized circuit-simulator program that inserts faults into circuits and tests your ability to locate them. It must be run on a Unix-type operating system or within WSL.
IDE software – an “Integrated Development Environment” is a software package used to write code, and for our purposes this would be code meant to run in a microcontroller. For the Texas Instruments MSP430 series, the main IDE is called Code Composer Studio, and it supports programming in assembly language, C, and C++. A third-party add-on to Code Composer Studio called Energia supports programming in the Sketch language, identical to that used by the popular Arduino microcontroller series.
file eet_tools
Grading standards for Experiment courses
Your grade for this course is based on percentage scores (in every calculation rounded down to whole-numbered values), with each category weighted as follows:
• Experiment scores = 25% (Note: all Experiments are mastery-based, which means they must be eventually completed at 100% competence in order to pass the course)

• Troubleshooting scores = 75% (Note: all Troubleshooting activities are mastery-based, which means they must be eventually completed at 100% competence in order to pass the course)
Please note the importance of completing all Experiments and all Troubleshooting activities on or before their respective deadline dates. If any Experiment or Troubleshooting activity is incomplete by the end of the school day of the deadline date, it will receive a 0% score. If any Experiment or Troubleshooting activity is incomplete by the end of the last day of the course, you will earn a failing grade (F) for the course; all Experiments and Troubleshooting activities must be complete by then to receive a passing grade.
Electronic submissions of Experiments and Troubleshooting activities are acceptable for full credit. The standards are just as high for electronic submissions as for face-to-face demonstrations. For Experiments, video documentation of you completing all objectives in their proper order will count as full credit. For Troubleshooting activities, the fault must be random and all steps must be videorecorded in one seamless take, which limits electronic submissions of Troubleshooting activities to those designated as computer simulations (i.e. where the computer simulation software implements the random fault).
This course is based on experiments and troubleshooting activities, and does not have fixed start and stop times as is the case with instructor-facilitated theory sessions. However, your punctual and consistent attendance is important for your success, as these activities require significant time-on-task to complete.
If you must be late or absent, it is imperative that you contact your instructor as well as any classmates you may be coordinating with so plans may be adjusted. It is still your responsibility to meet all deadlines.
A failing (F) grade will be earned for the entire course if any experiment or troubleshooting exercise is not completed on or before the deadline date, or for any of the following behaviors: false testimony (lying), cheating on any assignment or assessment, plagiarism (presenting another’s work as your own), willful violation of a safety policy, theft, harassment, sabotage, destruction of property, or intoxication. These behaviors are grounds for immediate termination in this career, and as such will not be tolerated here.
file eet_grading_e
Experiment 01
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate all the properties of series networks regarding voltage, current, and resistance.
Pre-run objective checklist:

• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
  © Shows all supporting mathematical work

• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Identifies components and test equipment to be used

• Risk assessments: [Attempts = ] [Completed = ]
  © Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
  © Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
  ⊙ All pre-run objectives must be certified complete before running the experiment
  ⊙ All safety protocols must be followed
  © Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:

• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate

• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
  © Describes lessons learned from this experiment

• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter component values and re-calculate)
  → e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
  → e.g. Other (redesign using different components, research datasheet parameters)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0003
Experiment 02
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate all the properties of parallel networks regarding voltage, current, and resistance.
Pre-run objective checklist:

• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
  © Shows all supporting mathematical work

• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Identifies components and test equipment to be used

• Risk assessments: [Attempts = ] [Completed = ]
  © Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
  © Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
  ⊙ All pre-run objectives must be certified complete before running the experiment
  ⊙ All safety protocols must be followed
  © Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:

• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate

• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
  © Describes lessons learned from this experiment

• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter component values and re-calculate)
  → e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
  → e.g. Other (redesign using different components, research datasheet parameters)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0004
Experiment 03
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate the principle of Kirchhoff’s Current Law in a DC series-parallel resistor circuit. Your demonstration should validate KCL at multiple nodes in the circuit.
Pre-run objective checklist:

• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
  © Shows all supporting mathematical work

• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Identifies components and test equipment to be used

• Risk assessments: [Attempts = ] [Completed = ]
  © Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
  © Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
  ⊙ All pre-run objectives must be certified complete before running the experiment
  ⊙ All safety protocols must be followed
  © Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:

• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate

• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
  © Describes lessons learned from this experiment

• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter component values and re-calculate)
  → e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
  → e.g. Other (redesign using different components, research datasheet parameters)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0005
Experiment 04
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate the principle of Kirchhoff’s Voltage Law in a DC series-parallel resistor circuit. Your demonstration should validate KVL within multiple loops in the circuit, at least one of them excluding the power source.
Pre-run objective checklist:

• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
  © Shows all supporting mathematical work

• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Identifies components and test equipment to be used

• Risk assessments: [Attempts = ] [Completed = ]
  © Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
  © Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
  ⊙ All pre-run objectives must be certified complete before running the experiment
  ⊙ All safety protocols must be followed
  © Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:

• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate

• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
  © Describes lessons learned from this experiment

• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter component values and re-calculate)
  → e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
  → e.g. Other (redesign using different components, research datasheet parameters)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0006
Experiment 05
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate the principle of voltage divider loading.
Pre-run objective checklist:

• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
  © Shows all supporting mathematical work

• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Identifies components and test equipment to be used

• Risk assessments: [Attempts = ] [Completed = ]
  © Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
  © Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
  ⊙ All pre-run objectives must be certified complete before running the experiment
  ⊙ All safety protocols must be followed
  © Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:

• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate

• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
  © Describes lessons learned from this experiment

• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter component values and re-calculate)
  → e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
  → e.g. Other (redesign using different components, research datasheet parameters)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0007
13
Experiment 06
NAME: DUE DATE:
Write and execute a SPICE simulation to demonstrate the behavior of a series-parallel resistor circuit.
You will find sample code with explanations in the “Gallery” chapter of the SPICE Modeling of Resistor Circuits learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_spice_r.pdf
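For orientation only, a minimal netlist in the general style of the Gallery examples might look like the following; the topology and component values are arbitrary assumptions, not the assigned circuit:

```
* Series-parallel resistor circuit (arbitrary example values)
* R2 and R3 are in parallel with each other, in series with R1
v1 1 0 dc 12
r1 1 2 1k
r2 2 0 2.2k
r3 2 0 3.3k
.op
.end
```

The .op analysis reports the DC operating point, from which node voltages and branch currents may be compared against hand calculations.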
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
  © Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Includes text or screenshots of simulation results
  © Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Includes any programming errors made (i.e. errors preventing the program from running)
  © Includes any simulation errors encountered (i.e. incorrect simulation results)
  © Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter parameter(s) and predict new program behavior)
  → e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
  → e.g. Other (e.g. redesign to achieve same objective with different code)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0019
14
Experiment 07
NAME: DUE DATE:
Write and execute a C/C++ simulation to demonstrate the behavior of a series-parallel resistor circuit.
You will find sample code with explanations in the “Programming References” chapter of the Series-Parallel Circuits learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_seriesparallel.pdf
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
  © Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Includes text or screenshots of simulation results
  © Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Includes any programming errors made (i.e. errors preventing the program from running)
  © Includes any simulation errors encountered (i.e. incorrect simulation results)
  © Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter parameter(s) and predict new program behavior)
  → e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
  → e.g. Other (e.g. redesign to achieve same objective with different code)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0020
15
Experiment 08
NAME: DUE DATE:
You may choose your own experiment, ideally one that will help you strengthen your understanding of one or more foundational principles. One suggestion is to choose a concept misunderstood or misapplied on a previous assessment (e.g. a failed exam question).
Checklists for physical experiments, computer simulations, and microcontroller-based experiments appear on the following pages. Your choice may be of any of these types (or a combination!).
16
Checklist for physical experiment
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
  © Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Identifies components and test equipment to be used
• Risk assessments: [Attempts = ] [Completed = ]
  © Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
  © Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
  © Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter component values and re-calculate)
  → e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
  → e.g. Other (e.g. redesign using different components, research datasheet parameters)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
17
Checklist for computer simulation
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
  © Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Includes text or screenshots of simulation results
  © Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Includes any programming errors made (i.e. errors preventing the program from running)
  © Includes any simulation errors encountered (i.e. incorrect simulation results)
  © Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter parameter(s) and predict new program behavior)
  → e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
  → e.g. Other (e.g. redesign to achieve same objective with different code)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
18
Checklist for microcontroller-based experiment
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
  © Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Identifies components and test equipment to be used
  © Cites any sampled source code and properly credits that code’s author
• Risk assessments: [Attempts = ] [Completed = ]
  © Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
  © Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Includes explanations of any errors and corrections
  © Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain circuit or code concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter component values or code and predict effects)
  → e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose/debug)
  → e.g. Other (e.g. redesign to achieve same objective with different components/code)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0002
19
Experiment 09
NAME: DUE DATE:
Write and execute a C/C++ simulation to characterize the voltage/current relationship for a diode.
You will find sample code with explanations in the “Programming References” chapter of the PN Junctions and Diodes learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_pn.pdf
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
  © Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Includes text or screenshots of simulation results
  © Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Includes any programming errors made (i.e. errors preventing the program from running)
  © Includes any simulation errors encountered (i.e. incorrect simulation results)
  © Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter parameter(s) and predict new program behavior)
  → e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
  → e.g. Other (e.g. redesign to achieve same objective with different code)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0021
20
Experiment 10
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate the principle of inverse-exponential growth and decay in a resistor-capacitor network.
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
  © Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Identifies components and test equipment to be used
• Risk assessments: [Attempts = ] [Completed = ]
  © Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
  © Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
  © Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter component values and re-calculate)
  → e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
  → e.g. Other (e.g. redesign using different components, research datasheet parameters)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0008
21
Experiment 11
NAME: DUE DATE:
Write and execute a SPICE simulation to demonstrate inverse-exponential growth and decay for voltage and current in a resistor-capacitor network.
You will find sample code with explanations in the “Gallery” chapter of the SPICE Modeling of Inductive and Capacitive Circuits learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_spice_lc.pdf
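For orientation only, a minimal transient-analysis netlist in the general style of the Gallery examples might look like the following; all component values are arbitrary assumptions, chosen to give a time constant τ = RC of one second:

```
* Resistor-capacitor time delay (arbitrary example values)
* tau = R*C = (10 kOhm)(100 uF) = 1 second
v1 1 0 dc 10
r1 1 2 10k
c1 2 0 100u ic=0
* Transient analysis: 0.1 second steps out to 5 time constants
.tran 0.1 5 uic
.end
```

The ic=0 and uic directives start the capacitor fully discharged, so the plotted capacitor voltage traces the inverse-exponential charging curve.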
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
  © Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Includes text or screenshots of simulation results
  © Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Includes any programming errors made (i.e. errors preventing the program from running)
  © Includes any simulation errors encountered (i.e. incorrect simulation results)
  © Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter parameter(s) and predict new program behavior)
  → e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
  → e.g. Other (e.g. redesign to achieve same objective with different code)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0023
22
Experiment 12
NAME: DUE DATE:
Write and execute a C/C++ simulation to demonstrate inverse-exponential growth and decay for voltage and current in a resistor-capacitor network.
You will find sample code with explanations in the “Programming References” chapter of the Capacitors and Capacitive Circuits learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_capacitor.pdf
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
  © Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Includes text or screenshots of simulation results
  © Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Includes any programming errors made (i.e. errors preventing the program from running)
  © Includes any simulation errors encountered (i.e. incorrect simulation results)
  © Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter parameter(s) and predict new program behavior)
  → e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
  → e.g. Other (e.g. redesign to achieve same objective with different code)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0024
23
Experiment 13
NAME: DUE DATE:
Write and execute a C/C++ simulation to demonstrate inverse-exponential growth and decay for voltage and current in a resistor-inductor network.
You will find sample code with explanations in the “Programming References” chapter of the Inductors and Inductive Circuits learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_inductor.pdf
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
  © Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Includes text or screenshots of simulation results
  © Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Includes any programming errors made (i.e. errors preventing the program from running)
  © Includes any simulation errors encountered (i.e. incorrect simulation results)
  © Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter parameter(s) and predict new program behavior)
  → e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
  → e.g. Other (e.g. redesign to achieve same objective with different code)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0025
24
Experiment 14
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate the behavior of an astable 555 timer circuit.
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
  © Makes clear and verifiable prediction(s), quantitative if at all possible
  © Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
  © Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
  © Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
  © Identifies components and test equipment to be used
• Risk assessments: [Attempts = ] [Completed = ]
  © Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
  © Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
  ⊙ All data must be original (i.e. no plagiarism)
  © Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
  ⊙ All conclusions must be your own (i.e. no plagiarism)
  © Explains how the collected data either confirms or refutes the hypothesis
  © Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
  © Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
  → e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
  → e.g. Quantitative (alter component values and re-calculate)
  → e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
  → e.g. Other (e.g. redesign using different components, research datasheet parameters)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0009
25
Experiment 15
NAME: DUE DATE:
You may choose your own experiment, ideally one that will help you strengthen your understanding of one or more foundational principles. One suggestion is to choose a concept misunderstood or misapplied on a previous assessment (e.g. a failed exam question).
Checklists for physical experiments and computer simulations appear on the following two pages. Your choice may be of either (or both!) types.
26
Checklist for physical experiment
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Identifies components and test equipment to be used
• Risk assessments: [Attempts = ] [Completed = ]
© Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
© Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
© Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter component values and re-calculate)
→ e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
→ e.g. Other (e.g. redesign using different components, research datasheet parameters)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
Checklist for computer simulation
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Includes text or screenshots of simulation results
© Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Includes any programming errors made (i.e. errors preventing the program from running)
© Includes any simulation errors encountered (i.e. incorrect simulation results)
© Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter parameter(s) and predict new program behavior)
→ e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
→ e.g. Other (e.g. redesign to achieve same objective with different code)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
Checklist for microcontroller-based experiment
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Identifies components and test equipment to be used
© Cites any sampled source code and properly credits that code’s author
• Risk assessments: [Attempts = ] [Completed = ]
© Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
© Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Includes explanations of any errors and corrections
© Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain circuit or code concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter component values or code and predict effects)
→ e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose/debug)
→ e.g. Other (e.g. redesign to achieve same objective with different components/code)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0002
Experiment 16
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate the behavior of a logic circuit built from electromechanical relays, implementing a logic function of your choosing (e.g. AND, OR, NAND, NOR, XOR, etc.).
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Identifies components and test equipment to be used
• Risk assessments: [Attempts = ] [Completed = ]
© Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
© Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
© Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter component values and re-calculate)
→ e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
→ e.g. Other (e.g. redesign using different components, research datasheet parameters)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0011
Experiment 17
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate the behavior of an IC logic gate, implementing a logic function of your choosing (e.g. AND, OR, NAND, NOR, XOR, etc.).
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Identifies components and test equipment to be used
• Risk assessments: [Attempts = ] [Completed = ]
© Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
© Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
© Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter component values and re-calculate)
→ e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
→ e.g. Other (e.g. redesign using different components, research datasheet parameters)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0012
Experiment 18
NAME: DUE DATE:
Write and execute a C/C++ simulation to demonstrate the behavior of a simple logic function.
You will find sample code with explanations in the “Programming References” chapter of the Basic Principles of Digital learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_digital.pdf
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Includes text or screenshots of simulation results
© Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Includes any programming errors made (i.e. errors preventing the program from running)
© Includes any simulation errors encountered (i.e. incorrect simulation results)
© Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter parameter(s) and predict new program behavior)
→ e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
→ e.g. Other (e.g. redesign to achieve same objective with different code)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0027
Experiment 19
NAME: DUE DATE:
Write and execute a C/C++ simulation to demonstrate the behavior of a combinational logic circuit.
You will find sample code with explanations in the “Programming References” chapter of the Combinational Logic learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_comblogic.pdf
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Includes text or screenshots of simulation results
© Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Includes any programming errors made (i.e. errors preventing the program from running)
© Includes any simulation errors encountered (i.e. incorrect simulation results)
© Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter parameter(s) and predict new program behavior)
→ e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
→ e.g. Other (e.g. redesign to achieve same objective with different code)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0028
Experiment 20
NAME: DUE DATE:
Write and execute a C/C++ simulation to demonstrate the behavior of a latch or flip-flop circuit.
You will find sample code with explanations in the “Programming References” chapter of the Latching Logic learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_latchlogic.pdf
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Includes text or screenshots of simulation results
© Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Includes any programming errors made (i.e. errors preventing the program from running)
© Includes any simulation errors encountered (i.e. incorrect simulation results)
© Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter parameter(s) and predict new program behavior)
→ e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
→ e.g. Other (e.g. redesign to achieve same objective with different code)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0029
Experiment 21
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate an arbitrary truth table implemented in IC gate logic. The truth table will have three inputs and be randomly assigned by the instructor. The gate circuit must drive a DC load requiring more current than the final gate alone can source or sink.
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Identifies components and test equipment to be used
• Risk assessments: [Attempts = ] [Completed = ]
© Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
© Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
© Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter component values and re-calculate)
→ e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
→ e.g. Other (e.g. redesign using different components, research datasheet parameters)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0013
Experiment 22
NAME: DUE DATE:
You may choose your own experiment, ideally one that will help you strengthen your understanding of one or more foundational principles. One suggestion is to choose a concept misunderstood or misapplied on a previous assessment (e.g. a failed exam question).
Checklists for physical experiments and computer simulations appear on the following pages. Your choice may be of either (or both!) types.
Checklist for physical experiment
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Identifies components and test equipment to be used
• Risk assessments: [Attempts = ] [Completed = ]
© Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
© Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
© Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter component values and re-calculate)
→ e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
→ e.g. Other (e.g. redesign using different components, research datasheet parameters)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
Checklist for computer simulation
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Includes text or screenshots of simulation results
© Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Includes any programming errors made (i.e. errors preventing the program from running)
© Includes any simulation errors encountered (i.e. incorrect simulation results)
© Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter parameter(s) and predict new program behavior)
→ e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
→ e.g. Other (e.g. redesign to achieve same objective with different code)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
Checklist for microcontroller-based experiment
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Identifies components and test equipment to be used
© Cites any sampled source code and properly credits that code’s author
• Risk assessments: [Attempts = ] [Completed = ]
© Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
© Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Includes explanations of any errors and corrections
© Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain circuit or code concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter component values or code and predict effects)
→ e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose/debug)
→ e.g. Other (e.g. redesign to achieve same objective with different components/code)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0002
Experiment 23
NAME: DUE DATE:
Write and execute a C/C++ simulation to plot a sinusoidal waveform to the console.
You will find sample code with explanations in the “Programming References” chapter of the Phasor Mathematics learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_phasor.pdf
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Includes text or screenshots of simulation results
© Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Includes any programming errors made (i.e. errors preventing the program from running)
© Includes any simulation errors encountered (i.e. incorrect simulation results)
© Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter parameter(s) and predict new program behavior)
→ e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
→ e.g. Other (e.g. redesign to achieve same objective with different code)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The totalnumber of completed objectives divided by the total number of attempts made yields the percentage score.Repeated edits and simulation runs are expected and will not count as additional attempts. A zero scorewill result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyoneelse’s data or work. Note that it is proper to sample some (but not all!) of your source code from previoussimulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0022
Experiment 24
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate the behavior of a passive filter circuit of your own choosing. The instructor will randomly select the filter’s cutoff frequency.
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Identifies components and test equipment to be used
• Risk assessments: [Attempts = ] [Completed = ]
© Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
© Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
© Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter component values and re-calculate)
→ e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
→ e.g. Other (e.g. redesign using different components, research datasheet parameters)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0010
Experiment 25
NAME: DUE DATE:
Write and execute a C/C++ simulation to demonstrate the behavior of a passive filter circuit. You will find sample code with explanations in the “Programming References” chapter of the Elementary Filter Circuits learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_filters.pdf
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Includes text or screenshots of simulation results
© Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Includes any programming errors made (i.e. errors preventing the program from running)
© Includes any simulation errors encountered (i.e. incorrect simulation results)
© Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter parameter(s) and predict new program behavior)
→ e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
→ e.g. Other (e.g. redesign to achieve same objective with different code)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0030
Experiment 26
NAME: DUE DATE:
Write and execute a SPICE simulation to demonstrate the behavior of a wye or delta polyphase circuit. You will find sample code with explanations in the “Gallery” chapter of the SPICE Modeling of Power Circuits learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_spice_power.pdf
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Includes text or screenshots of simulation results
© Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Includes any programming errors made (i.e. errors preventing the program from running)
© Includes any simulation errors encountered (i.e. incorrect simulation results)
© Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter parameter(s) and predict new program behavior)
→ e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
→ e.g. Other (e.g. redesign to achieve same objective with different code)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0026
Experiment 27
NAME: DUE DATE:
Write and execute a C/C++ simulation to demonstrate the behavior of a three-phase AC circuit. You will find sample code with explanations in the “Programming References” chapter of the Polyphase AC learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_polyphase.pdf
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Includes text or screenshots of simulation results
© Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Includes any programming errors made (i.e. errors preventing the program from running)
© Includes any simulation errors encountered (i.e. incorrect simulation results)
© Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter parameter(s) and predict new program behavior)
→ e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
→ e.g. Other (e.g. redesign to achieve same objective with different code)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0031
Experiment 28
NAME: DUE DATE:
Write and execute a C/C++ simulation to demonstrate the principle of a harmonic series synthesizing different waveshapes. You will find sample code with explanations in the “Programming References” chapter of the Frequency-Domain Analysis learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_freqdomain.pdf
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Includes text or screenshots of simulation results
© Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Includes any programming errors made (i.e. errors preventing the program from running)
© Includes any simulation errors encountered (i.e. incorrect simulation results)
© Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter parameter(s) and predict new program behavior)
→ e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
→ e.g. Other (e.g. redesign to achieve same objective with different code)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0032
Experiment 29
NAME: DUE DATE:
Write and execute a C/C++ simulation to demonstrate the plotting of an arbitrary waveform. You will find sample code with explanations in the “Programming References” chapter of the Phasor Mathematics learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_phasor.pdf
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Includes text or screenshots of simulation results
© Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Includes any programming errors made (i.e. errors preventing the program from running)
© Includes any simulation errors encountered (i.e. incorrect simulation results)
© Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter parameter(s) and predict new program behavior)
→ e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
→ e.g. Other (e.g. redesign to achieve same objective with different code)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0034
Experiment 30
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate the behavior of a single-transistor BJT amplifier circuit.
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Identifies components and test equipment to be used
• Risk assessments: [Attempts = ] [Completed = ]
© Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
© Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
© Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter component values and re-calculate)
→ e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
→ e.g. Other (e.g. redesign using different components, research datasheet parameters)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0014
Experiment 31
NAME: DUE DATE:
Write and execute a SPICE simulation to demonstrate the behavior of a BJT amplifier circuit. You will find sample code with explanations in the “Gallery” chapter of the SPICE Modeling of Amplifier Circuits learning module:
http://ibiblio.org/kuphaldt/socratic/model/mod_spice_amp.pdf
Pre-simulation objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of simulated circuit in full detail (i.e. everything you will simulate)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Cites any sampled source code and properly credits that code’s author

Simulation run:
⊙ All pre-simulation objectives must be certified complete before running your code

Post-simulation objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Includes text or screenshots of simulation results
© Lists final version of source code
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Includes any programming errors made (i.e. errors preventing the program from running)
© Includes any simulation errors encountered (i.e. incorrect simulation results)
© Describes lessons learned from this simulation
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain programming concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter parameter(s) and predict new program behavior)
→ e.g. Diagnostic (explain effects of errors, choose effective methods to debug)
→ e.g. Other (e.g. redesign to achieve same objective with different code)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated edits and simulation runs are expected and will not count as additional attempts. A zero score will result if (1) the simulation is run without all pre-simulation objectives certified; or (2) you copy anyone else’s data or work. Note that it is proper to sample some (but not all!) of your source code from previous simulations of your own or from others, so long as it is properly cited.
You must answer the challenge question without aid from any external information source.
file we_0033
Experiment 32
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate the behavior of a multi-transistor audio amplifier circuit.
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Identifies components and test equipment to be used
• Risk assessments: [Attempts = ] [Completed = ]
© Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
© Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
© Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter component values and re-calculate)
→ e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
→ e.g. Other (e.g. redesign using different components, research datasheet parameters)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0015
Experiment 33
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate the principle of achieving a precise voltage gain using an operational amplifier.
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Identifies components and test equipment to be used
• Risk assessments: [Attempts = ] [Completed = ]
© Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
© Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
© Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter component values and re-calculate)
→ e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
→ e.g. Other (e.g. redesign using different components, research datasheet parameters)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0016
Experiment 34
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate the behavior of an operational amplifier circuit with a discrete-transistor power stage driving a load that would be too heavy for the opamp itself. Like any other opamp-based amplifier, this should exhibit a precise voltage gain.
Pre-run objective checklist:
• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]
© Makes clear and verifiable prediction(s), quantitative if at all possible
© Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)
© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]
© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)
© Identifies components and test equipment to be used
• Risk assessments: [Attempts = ] [Completed = ]
© Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate
© Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) and methods to mitigate

Experimental run:
⊙ All pre-run objectives must be certified complete before running the experiment
⊙ All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded

Post-run objective checklist:
• Data collected: [Attempts = ] [Completed = ]
⊙ All data must be original (i.e. no plagiarism)
© Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
© Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter component values and re-calculate)
→ e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
→ e.g. Other (e.g. redesign using different components, research datasheet parameters)

The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0017
Experiment 35
NAME: DUE DATE:
Plan and conduct an experiment to demonstrate the behavior of a sinusoidal oscillator circuit, including both time-domain and frequency-domain analyses of its output signal.
Pre-run objective checklist:• Written hypothesis: [Attempts = (ungraded) ] [Completed = ]© Makes clear and verifiable prediction(s), quantitative if at all possible© Shows schematic diagram of experimental circuit in full detail (i.e. everything you will build)© Shows all supporting mathematical work
• Written procedure: [Attempts = ] [Completed = ]© Clearly states criteria for either accepting or rejecting the hypothesis (i.e. “How will we know?”)© Identifies components and test equipment to be used
• Risk assessments: [Attempts = ] [Completed = ]© Identifies all personal risks (e.g. shock, burns, inhalation) and methods to mitigate© Identifies all risks to hardware (e.g. citing maximum ratings from datasheets where applicable) andmethods to mitigate
Experimental run:⊙All pre-run objectives must be certified complete before running the experiment⊙All safety protocols must be followed
© Running experiment shown to the instructor, either live or recorded
Post-run objective checklist:• Data collected: [Attempts = ] [Completed = ]⊙
All data must be original (i.e. no plagiarism)© Records data with full precision (i.e. no rounding), sketches or screenshots where appropriate
• Written analysis: [Attempts = ] [Completed = ]
⊙ All conclusions must be your own (i.e. no plagiarism)
© Explains how the collected data either confirms or refutes the hypothesis
© Calculates error (Error % of value = (Measured − Predicted) / Predicted × 100%) and proposes sources of error
© Describes lessons learned from this experiment
• Challenge question: [Attempts = ] [Completed = ]
→ e.g. Conceptual (explain circuit concepts, correct a misconception, qualitative analysis)
→ e.g. Quantitative (alter component values and re-calculate)
→ e.g. Diagnostic (explain effects of faults, choose effective tests to diagnose)
→ e.g. Other (e.g. redesign using different components, research datasheet parameters)
The instructor will have you demonstrate each completed objective, in order from top to bottom. The total number of completed objectives divided by the total number of attempts made yields the percentage score. Repeated experimental runs are expected and will not count as additional attempts. A zero score will result if (1) the experiment is run without all pre-run objectives certified; or (2) any safety standard is violated (e.g. touching energized conductors); or (3) you copy anyone else’s data or work.
You must answer the challenge question without aid from any external information source.
file we_0018
Troubleshoot 36
NAME: DUE DATE:
Troubleshoot a computer-simulated fault within an unloaded voltage divider circuit. The schematic diagram for this circuit as simulated in the tshoot troubleshooting simulator software is shown below:
[Schematic, Circuit #001: voltage source V1 protected by fuse F1 and switched by toggle switch S1, feeding series-connected resistors R1, R2, and R3; test points TP0 through TP5 provide measurement access.]
Nominal component values:
V1 = _______ Volts +/- _____ %
R1 = _______ Ohms +/- _____ %
R2 = _______ Ohms +/- _____ %
R3 = _______ Ohms +/- _____ %
The tshoot software randomly selects the fault and the circuit component values for you, after which you will have a limited amount of time to perform measurements and other tests. The software tracks each diagnostic step you take and the amount of time you needed to take each step, and assigns a “cost” to each step based on its complexity and risk. A successful troubleshooting exercise consists of both correctly identifying the location and nature of the fault, as well as logically defending the necessity of each diagnostic step. Incorrect fault identification, unnecessary steps, and/or incorrect defense of any step will result in a failed attempt. “Par” scores exist for the number of steps taken, time, and cost of correctly diagnosing the fault. You must achieve par or better.
Troubleshooting is mastery-based, meaning every troubleshooting exercise must be competently completed by the due date in order to pass the course, and you will be given multiple opportunities to re-try if you do not pass on the first attempt. Each re-try begins with another randomized fault. Scoring is based on the number of attempts needed to successfully troubleshoot a circuit (e.g. 1 attempt = 100% ; 2 attempts = 80% ; 3 attempts = 60% ; 4 attempts = 40% ; 5 attempts = 20% ; 6 or more attempts = 0%). Troubleshooting assessments are closed-book and closed-note. You are encouraged to practice using the tshoot software, which is free and readily available for your use.
file we_1002
Troubleshoot 37
NAME: DUE DATE:
Troubleshoot a computer-simulated fault within a loaded voltage divider circuit. The schematic diagram for this circuit as simulated in the tshoot troubleshooting simulator software is shown below:
[Schematic, Circuit #004: voltage source V1 with fuse F1 and switches S1 and S2, feeding divider resistors R3, R4, and R5, with load resistors R1 (Load 1) and R2 (Load 2); test points TP0 through TP6 provide measurement access.]
Nominal component values:
V1 = _______ Volts +/- _____ %
R1 = _______ Ohms +/- _____ %
R2 = _______ Ohms +/- _____ %
R3 = _______ Ohms +/- _____ %
R4 = _______ Ohms +/- _____ %
R5 = _______ Ohms +/- _____ %
The tshoot software randomly selects the fault and the circuit component values for you, after which you will have a limited amount of time to perform measurements and other tests. The software tracks each diagnostic step you take and the amount of time you needed to take each step, and assigns a “cost” to each step based on its complexity and risk. A successful troubleshooting exercise consists of both correctly identifying the location and nature of the fault, as well as logically defending the necessity of each diagnostic step. Incorrect fault identification, unnecessary steps, and/or incorrect defense of any step will result in a failed attempt. “Par” scores exist for the number of steps taken, time, and cost of correctly diagnosing the fault. You must achieve par or better.
Troubleshooting is mastery-based, meaning every troubleshooting exercise must be competently completed by the due date in order to pass the course, and you will be given multiple opportunities to re-try if you do not pass on the first attempt. Each re-try begins with another randomized fault. Scoring is based on the number of attempts needed to successfully troubleshoot a circuit (e.g. 1 attempt = 100% ; 2 attempts = 80% ; 3 attempts = 60% ; 4 attempts = 40% ; 5 attempts = 20% ; 6 or more attempts = 0%). Troubleshooting assessments are closed-book and closed-note. You are encouraged to practice using the tshoot software, which is free and readily available for your use.
file we_1003
Troubleshoot 38
NAME: DUE DATE:
Troubleshoot a fault within a combinational logic circuit. This circuit shall be constructed in such a manner that all circuit components and simulated faults must be hidden from view (e.g. covering it up with a box or towel) but test points will be available for contact with a multimeter’s probes. A schematic diagram showing the circuit and its test points will be allowed for use during the troubleshooting exercise.
The circuit shall have three inputs and drive a DC load requiring more current than the final gate alone can source or sink. Possible faults include:
• Any resistor failed open
• Any transistor terminal failed open
• Any gate output failed high
• Any gate output failed low
The instructor will either set up or supervise other students setting up a random fault hidden from view. You will then have a limited amount of time to independently perform measurements and other tests while under the continuous observation of the instructor. A successful troubleshooting exercise consists of both correctly identifying the location and nature of the fault, as well as logically defending the necessity of each diagnostic step. Incorrect fault identification, unnecessary steps, and/or incorrect defense of any step will result in a failed attempt. Your only access to the faulted circuit will be via the test points, and only one unpowered test will be permitted.
Troubleshooting is mastery-based, meaning every troubleshooting exercise must be competently completed by the due date in order to pass the course, and you will be given multiple opportunities to re-try if you do not pass on the first attempt. Each re-try begins with another randomized fault. Scoring is based on the number of attempts needed to successfully troubleshoot a circuit (e.g. 1 attempt = 100% ; 2 attempts = 80% ; 3 attempts = 60% ; 4 attempts = 40% ; 5 attempts = 20% ; 6 or more attempts = 0%). Troubleshooting assessments are closed-book and closed-note.
file we_1005
Troubleshoot 39
NAME: DUE DATE:
Troubleshoot a fault within a passive low-pass or high-pass filter circuit. This circuit shall be constructed in such a manner that all circuit components and simulated faults must be hidden from view (e.g. covering it up with a box or towel) but test points will be available for contact with a multimeter’s probes. A schematic diagram showing the circuit and its test points will be allowed for use during the troubleshooting exercise.
The circuit shall be entirely passive and consist only of resistor(s) and capacitor(s), energized by a signal generator with variable frequency and variable amplitude adjustments provided, and powering a high-resistance resistive load. Possible faults include:
• Any cable failed open
• Any cable failed shorted
• Any component failed open
• Any component failed shorted
• Any component value altered
The instructor will either set up or supervise other students setting up a random fault hidden from view. You will then have a limited amount of time to independently perform measurements and other tests while under the continuous observation of the instructor. A successful troubleshooting exercise consists of both correctly identifying the location and nature of the fault, as well as logically defending the necessity of each diagnostic step. Incorrect fault identification, unnecessary steps, and/or incorrect defense of any step will result in a failed attempt. Your only access to the faulted circuit will be via the test points, and only one unpowered test will be permitted.
Troubleshooting is mastery-based, meaning every troubleshooting exercise must be competently completed by the due date in order to pass the course, and you will be given multiple opportunities to re-try if you do not pass on the first attempt. Each re-try begins with another randomized fault. Scoring is based on the number of attempts needed to successfully troubleshoot a circuit (e.g. 1 attempt = 100% ; 2 attempts = 80% ; 3 attempts = 60% ; 4 attempts = 40% ; 5 attempts = 20% ; 6 or more attempts = 0%). Troubleshooting assessments are closed-book and closed-note.
file we_1004
Troubleshoot 40
NAME: DUE DATE:
Troubleshoot a fault within a multi-stage audio amplifier circuit. This circuit shall be constructed in such a manner that all circuit components and simulated faults must be hidden from view (e.g. covering it up with a box or towel) but test points will be available for contact with a multimeter’s probes. A schematic diagram showing the circuit and its test points will be allowed for use during the troubleshooting exercise.
The circuit shall contain at least two amplification stages and power a resistive load. The signal source may be a signal generator with adjustable frequency and amplitude, or something else appropriate such as a microphone. Possible faults include:
• Any cable failed open
• Any cable failed shorted
• Any component failed open
• Any component failed shorted
• Any component value altered
• DC power source failed
The instructor will either set up or supervise other students setting up a random fault hidden from view. You will then have a limited amount of time to independently perform measurements and other tests while under the continuous observation of the instructor. A successful troubleshooting exercise consists of both correctly identifying the location and nature of the fault, as well as logically defending the necessity of each diagnostic step. Incorrect fault identification, unnecessary steps, and/or incorrect defense of any step will result in a failed attempt. Your only access to the faulted circuit will be via the test points, and only one unpowered test will be permitted.
Troubleshooting is mastery-based, meaning every troubleshooting exercise must be competently completed by the due date in order to pass the course, and you will be given multiple opportunities to re-try if you do not pass on the first attempt. Each re-try begins with another randomized fault. Scoring is based on the number of attempts needed to successfully troubleshoot a circuit (e.g. 1 attempt = 100% ; 2 attempts = 80% ; 3 attempts = 60% ; 4 attempts = 40% ; 5 attempts = 20% ; 6 or more attempts = 0%). Troubleshooting assessments are closed-book and closed-note.
file we_1006
Lab clean-up 41
NAME:
This list represents all of the major work-items that must be done at every semester’s end to prepare the lab space for the upcoming semester. Each student will have at least one task assigned to them.
Non-technical tasks
© Thoroughly clean whiteboard(s)
© Clean floor of all debris
© Clean all workbench surfaces
© Organize all cables, cords, test leads neatly into their storage locations
© Clean all electrical panel and test equipment surfaces
© Note any depleted bins (electronic components, threaded fasteners, cables, etc.)
→ Report to instructor for re-ordering in preparation for next semester
Technical tasks
© Check fastener storage bins to ensure no fasteners are misplaced
© Check digital IC storage bins to ensure no ICs are misplaced
© Check resistor storage bins to ensure no resistors are misplaced
© Check inductor/transformer storage bins to ensure no inductors or transformers are misplaced
© Check capacitor storage bins to ensure no capacitors are misplaced
© Test oscilloscopes for basic functionality (e.g. all channels functional, all vertical sensitivity settings functional, all timebase settings functional, triggering functions properly)

© Test signal generators for basic functionality (e.g. all waveshapes functional, magnitude adjustment functional, frequency adjustment(s) functional)

© Test power supplies for basic functionality (e.g. voltage adjustments functional, current limits functional, voltage/current meters functional)

© Test benchtop multimeters for basic functionality (e.g. all voltage ranges functional, all current ranges functional, overcurrent fuse good)
© Test permanently-installed demonstration projects for basic functionality (read instructions on each!)
file we_2000
General Troubleshooting Advice
All electronic circuit faults fall into at least one of these categories:
• Connection fault – the components are not properly connected together.
• Design flaw – the circuit cannot work because something about it is incorrectly designed.
• Lack of power/signal or poor quality – the power and/or signal source is “dead” or “noisy”.
• Component fault – one or more components is faulty.
• Test equipment – either the test equipment itself is faulty, or it is not being used appropriately.
Of these categories, the one causing more problems for students initially learning about circuits than all the others is the first: connection fault. This is because the ability to translate an idea and/or a schematic diagram into a physical circuit is a skill requiring time to develop. Many such problems may be avoided by (1) drawing a complete schematic of what you intend to build before you build it, (2) marking that schematic to show which connections have been made and which are left to make, and (3) using an ohmmeter (not your eyes!) to verify that every pair of points which should be connected is connected and that no points which should be electrically distinct from each other are in fact electrically common.
Troubleshooting strategies
• Verify the symptom(s) – Always check to see that the symptom(s) match what you’ve been told by others. Even if the symptoms were correctly reported, you may notice additional (unreported) symptoms helpful in identifying the fault.
• Verify good power quality – Is the source voltage within specifications, and relatively free of “ripple” and other noise?
• Check signals at component terminals – Use an oscilloscope or multimeter to check for proper signals at each of the component pins, to see if each one matches your expectations. An important check, especially for integrated circuits, is whether the measured output signal(s) are appropriate for the measured input signal(s).
• Simplify the system – If possible, re-configure the circuit to be as simple as possible, because complexity makes faults harder to find.
• Swap identical components – If a particular component is suspected of being faulty, and you are able to swap another (identical) component for it, do so to see whether or not the problem moves with the old component. If so, that component is to blame; if not, the problem lies elsewhere.
• Always look for Root Cause(s) – Don’t declare success simply by finding the proximate (i.e. the most direct) cause, but continue your search to find what design flaw, circumstance, or other distal cause led to it.
file eet_troubleshooting