CS 457/557  Winter Quarter 2024
Test #1 Review
This page was last updated: January 16, 2024
Test dates and times:
Test #1 will go live at 12:01 AM PT on Wednesday, February 7.
It will close at 11:59 PM PT on Sunday, February 11.
This gives you 119 hours and 58 minutes in which to take a 1-hour test.
Test Information:

This will be a multiple-choice test cast as a Canvas "Quiz".

There will be 40 questions, worth 2.5 points each.

You will have 60 minutes to complete it.
Once you start, you need to finish.
Canvas does not allow you to pause, leave, then come back and resume.

The test is open notes and closed friends.
Warning! "Open Notes" is not the same as "I don't need to study for it"!
You will run out of time if you have to look up every one of the questions in the notes.

Clearly, I cannot stop you from accessing information on the Internet.
However, the test has been written against our class notes.
If you miss a particular question, any protest of the form "But something-something.com said that..." will be ignored.

You are responsible for
 what is in the handouts
 what was said in class and the videos, including the Live Lectures
 what was covered on the quizzes
 what you have done in the projects
The test can cover any of the following topics:

Course Infrastructure:
Project Turn-in procedures
Bonus Days
[ History of Shaders won't be on the test ]

Homogeneous coordinates
What that means
What are they good for?
What the projection matrices do besides projection
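
For intuition, here is a small C sketch (C standing in for GLSL; the names are mine, not class code) of what the homogeneous w-coordinate means: the point (x,y,z,w) represents the 3D point (x/w, y/w, z/w), and the projection matrix puts depth into w so that this divide produces perspective foreshortening.

```c
#include <assert.h>

/* A homogeneous point (x,y,z,w) represents the 3D point (x/w, y/w, z/w). */
typedef struct { float x, y, z, w; } vec4;
typedef struct { float x, y, z; } vec3;

/* The perspective divide: clip coordinates -> normalized device coordinates. */
vec3 PerspectiveDivide( vec4 clip )
{
    vec3 ndc = { clip.x / clip.w, clip.y / clip.w, clip.z / clip.w };
    return ndc;
}
```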

Coordinate systems: Model Coordinates (MC), World/Eye Coordinates (EC), Screen Coordinates (SC)

GLSL:
GLSL vertex and fragment shaders.
How they work.
Where they fit into the overall OpenGL pipeline.
What the vertex shader replaces. What it doesn't.
What the fragment shader replaces. What it doesn't.
Where the GLSL shader compiler lives (for us, in the driver)
Uniform, Out/In, and Per-vertex Attribute variables.
What they do.
In what circumstances they are used.
Built-in GLSL variables
Built-in GLSL functions.
[ You don't need to know all of them, just the major ones we've been using in class and that are used in the projects. ]

GLSL coordinates:
Model: gl_Vertex
Eye/World: gl_ModelViewMatrix * gl_Vertex
What effects you can get with each (stripes example)
Clip: gl_ModelViewProjectionMatrix * gl_Vertex
NDC: ( gl_ModelViewProjectionMatrix * gl_Vertex ) divided by the resulting w
Sent down to the rest of the pipeline: gl_Position
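
The coordinate chain above can be written out in C terms (a sketch of the arithmetic only; matrix layout and names are my own choices, not the class's code):

```c
#include <assert.h>

typedef struct { float v[4]; } vec4;
typedef struct { float m[4][4]; } mat4;   /* row-major in this sketch */

/* Matrix * column-vector multiply */
vec4 MatMul( mat4 M, vec4 p )
{
    vec4 r;
    for( int i = 0; i < 4; i++ )
    {
        r.v[i] = 0.f;
        for( int j = 0; j < 4; j++ )
            r.v[i] += M.m[i][j] * p.v[j];
    }
    return r;
}

/* The vertex-shader coordinate chain:
     eye  = gl_ModelViewMatrix  * gl_Vertex
     clip = gl_ProjectionMatrix * eye
          ( = gl_ModelViewProjectionMatrix * gl_Vertex )
     ndc  = clip / clip.w
   gl_Position carries clip coordinates down the rest of the pipeline. */
vec4 ToClip( mat4 proj, mat4 modelview, vec4 vertex )
{
    return MatMul( proj, MatMul( modelview, vertex ) );
}
```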

Normal vector transformation:
gl_NormalMatrix * gl_Normal
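
A tiny C demo (mine, not class code) of why normals need gl_NormalMatrix, the inverse transpose of the upper-left 3x3 of the ModelView matrix, rather than the ModelView matrix itself: under a non-uniform scale, transforming the normal by the plain matrix breaks its perpendicularity to the surface, while the inverse transpose (for a diagonal scale, just the reciprocal scale) preserves it.

```c
#include <assert.h>

typedef struct { float x, y, z; } vec3;

float Dot( vec3 a, vec3 b ) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* A diagonal (non-uniform) scale stands in for the model matrix here */
vec3 Scale( vec3 p, float sx, float sy, float sz )
{
    vec3 r = { p.x*sx, p.y*sy, p.z*sz };
    return r;
}
```

A surface tangent t = (1,1,0) and its normal n = (1,-1,0) start out perpendicular; after scaling by (2,1,1), only the inverse-transpose-transformed normal stays perpendicular to the transformed tangent.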

Shader patterns:
Deciding what to key off of
Stripes: keying off Cartesian (x,y,z) coordinates vs. keying off texture (s,t) coordinates
Stripes: keying off a quantity in Model Coordinates vs. a quantity in World/Eye Coordinates

The GLSL discard operator
What happens if you use gl_FragColor.a = 0. instead?

The GLSL API for using shaders:
Reading, creating, compiling, attaching, linking
Passing in uniform and attribute variables
Enabling (Use'ing) a shader program
Un-Use'ing shader programs (and thus returning to Fixed-Function OpenGL)
The fact that OpenGL ES (mobile, web) has no fixed-function pipeline
[You don't need to know the names of the OpenGL functions that are used to create the shader program.]
Our glslprogram C++ class

Mixing / Blending:
The step( ) and smoothstep( ) functions
Creating our own smoothpulse( ) from two smoothstep( ) functions
The mix( ) function
How to blur the ellipse boundaries
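
These three functions in C form (the GLSL built-ins smoothstep( ) and mix( ) behave like this; SmoothPulse is our own construction from two smoothsteps):

```c
#include <assert.h>

float Clamp01( float t ) { return t < 0.f ? 0.f : ( t > 1.f ? 1.f : t ); }

/* GLSL-style smoothstep: 0 below edge0, 1 above edge1, smooth Hermite between */
float SmoothStep( float edge0, float edge1, float x )
{
    float t = Clamp01( ( x - edge0 ) / ( edge1 - edge0 ) );
    return t * t * ( 3.f - 2.f * t );
}

/* A smooth pulse from two smoothsteps: rises around "left", falls around "right".
   "blur" is the half-width of each transition -- this is how ellipse
   boundaries get blurred instead of hard-edged. */
float SmoothPulse( float left, float right, float blur, float x )
{
    return SmoothStep( left - blur, left + blur, x )
         - SmoothStep( right - blur, right + blur, x );
}

/* GLSL-style mix: linear blend, a at t=0, b at t=1 */
float Mix( float a, float b, float t ) { return a + t * ( b - a ); }
```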

Fun With One:
Different ways to manipulate the path from 0. to 1.
[ The exact equations won't be on the test. ]

Morphing:
cow-to-sphere, cow-to-cube, sphere-to-disk
Why not cow-to-dino?
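
The morph itself is just a per-vertex mix( ) between corresponding vertices of the two shapes, which is exactly why both shapes must have the same number of vertices in one-to-one correspondence (a cow and a dino mesh don't). A C sketch:

```c
#include <assert.h>

typedef struct { float x, y, z; } vec3;

/* Per-vertex morph between two models with the SAME vertex count:
   t = 0. gives shape A, t = 1. gives shape B. */
vec3 Morph( vec3 a, vec3 b, float t )
{
    vec3 p = { a.x + t*( b.x - a.x ),
               a.y + t*( b.y - a.y ),
               a.z + t*( b.z - a.z ) };
    return p;
}
```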

Stripes, Rings, Dots:
Getting stripes
Applying a sine wave to the stripes
Getting rings
Getting circles
Getting ellipses

GLSL Textures:
Types (1D, 2D, 3D, Cube)
One overloaded texture( ) function
All types return a vec4. They are "typed" by what gets passed in to index into the texture.
Texture units
Sampler variables (e.g., sampler2D, sampler3D, samplerCube)

Displacement textures:
Yes, you can read a texture from a vertex shader in order to displace vertices
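
The displacement arithmetic itself is one line: move the vertex along its normal by the height value that (in the real shader) came from a texture( ) lookup in the vertex shader. A C sketch:

```c
#include <assert.h>

typedef struct { float x, y, z; } vec3;

/* p' = p + height*scale*n, with "height" read from a displacement texture */
vec3 Displace( vec3 p, vec3 n, float height, float scale )
{
    vec3 r = { p.x + height*scale*n.x,
               p.y + height*scale*n.y,
               p.z + height*scale*n.z };
    return r;
}
```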

Noise:
What does it mean that noise needs to be coherent?
What does it mean that noise needs to be repeatable?
Why it can't really be random
What's wrong with Positional noise?
How Gradient noise (also called Perlin noise) is produced
Noise amplitude, Noise frequency
Octaves (twice the frequency, half the amplitude)
"Turbulence"
How to use noise and why
glman's noise: "baked" into a lookup texture
[ The noise equations won't be on the test. ]

Noise terrain:
Using noise to create a terrain surface

Bump-mapping:
General idea
Advantages of bump-mapping over displacement-mapping
Advantages of displacement-mapping over bump-mapping

Height Field Bump-Mapping (e.g., Oregon terrain):
What is a height field? (≈ A pin box.)
Storing and accessing the height field data
Tangent Vectors
Computing the normal
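
The normal comes from crossing the two tangent vectors across the height-field cell. A C sketch (my own simplification; the class code may step and scale differently), with heights stored in z:

```c
#include <assert.h>

typedef struct { float x, y, z; } vec3;

vec3 Cross( vec3 a, vec3 b )
{
    vec3 c = { a.y*b.z - a.z*b.y,
               a.z*b.x - a.x*b.z,
               a.x*b.y - a.y*b.x };
    return c;
}

/* Normal at a height-field sample: h = height here, hx = height one grid
   step away in x, hy = one step away in y, step = grid spacing. */
vec3 HeightFieldNormal( float h, float hx, float hy, float step )
{
    vec3 tx = { step, 0.f, hx - h };   /* tangent in x */
    vec3 ty = { 0.f, step, hy - h };   /* tangent in y */
    return Cross( tx, ty );            /* normalize before lighting */
}
```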

Equation-based Bump-Mapping (e.g., ripples):
The calculus of Tangent Vectors (dz/dx, dz/dy)
Computing the normal

Nonlinear uses for the vertex shader:
Dome projection [you are not responsible for the equations]
Hyperbolic Geometry [you are not responsible for the equations]

Lighting: Ambient, Diffuse, Specular
Per-vertex vs. Per-fragment lighting: what gets interpolated in each
Flat vs. Smooth lighting: what each looks like
The flat GLSL keyword for in and out variables
How the disco ball lighting works and the fact that it only uses one light source

Cube-mapping:
A cube-map texture consists of 6 images in one "texture"
How an (s,t,p) vec3 gets turned into an (r,g,b) from a cube map
Reflection. (reflect function)
How the cube-mapped version differs from "real" reflection (2 things)
Refraction. (refract function)
How the cube-mapped version differs from "real" refraction (3 things)

The test can cover any of the following Projects:

Project 1:
Elliptical Dots: ellipse equations, finding out what checker you are in, smoothstep( ) function, mix( ) function
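
The core Project 1 arithmetic in C form (a sketch with my own names and a hard edge for brevity; in the shader you would replace the hard <= 1 test with smoothstep( ) around 1. and feed the result to mix( ) to blur the boundary):

```c
#include <assert.h>

/* s,t in [0,1]; numS,numT = checker counts; As,At = ellipse radii in
   cell units.  Returns 1. inside the cell's ellipse, 0. outside. */
float EllipseDot( float s, float t, float numS, float numT,
                  float As, float At )
{
    int   is = (int)( s * numS );               /* which checker cell in s */
    int   it = (int)( t * numT );               /* which checker cell in t */
    float sc = ( (float)is + 0.5f ) / numS;     /* cell center             */
    float tc = ( (float)it + 0.5f ) / numT;
    float ds = ( s - sc ) * numS / As;          /* ellipse equation terms  */
    float dt = ( t - tc ) * numT / At;
    float d  = ds*ds + dt*dt;                   /* <= 1. means inside      */
    return d <= 1.f  ?  1.f  :  0.f;
}
```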

Project 2:
Noisy Displaced Elliptical Dots: use of noise, glman's way of giving you noise, how to apply noise to ellipse boundaries

Project 3:
Displacement Mapping, Bump-Mapping, and Lighting: computing derivatives, tangent vectors,
computing the normal, perturbing the normal