
Computer Graphics Assignments

Winter Semester 2023/2024

With these exercises, you will implement various features and functionalities in JavaScript and WebGL. We provide you with a small framework (cg-framework.zip) that already implements a basic 3D renderer in WebGL, which you will extend over the course of the semester.

The framework uses the following external libraries:

./js/twgl.js – TWGL: A Tiny WebGL helper library (https://twgljs.org/): This library helps us to reduce the boilerplate code necessary to set up WebGL rendering in JavaScript and provides several other useful utilities, such as functions for vectors and matrices. Please refer to the documentation and examples for further details.

./js/gl-matrix.js – glMatrix (https://glmatrix.net/): This library provides all the essential matrix and vector operations you will typically need and is designed to interoperate well with WebGL applications.

./js/tweakpane-4.0.0.js – Tweakpane (https://cocopon.github.io/tweakpane/): This library provides us with simple GUI elements for modifying parameters and other settings. The framework already provides a few GUI elements for manipulating the existing settings, but you should expose any additional parameters that you introduce.
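For orientation, here is a minimal sketch of exposing such an additional parameter with the Tweakpane 4 API. It assumes that pane refers to the Pane instance created by the framework's GUI setup (see ./js/util.js), and the parameter name is only an example:

// Expose an additional parameter in the GUI (names are examples only).
const params = { lightIntensity: 1.0 };

pane.addBinding(params, 'lightIntensity', { min: 0.0, max: 2.0, step: 0.01 })
    .on('change', (ev) => {
      // React to the new value, e.g. update a shader uniform and redraw.
      console.log('lightIntensity is now', ev.value);
    });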

In addition, we already provide you with a few useful pieces of code that will help you get started quickly. You normally should not need to make changes to any of these files. If you do make changes, then make sure that you also submit them with your solution!

./js/arcball.js: Implements a so-called arcball controller for camera manipulation.

./js/parseobj.js: Implements a loader for the Wavefront OBJ model file format.

./js/storage.js: Implements a persistent object store that preserves settings (including loaded models) between sessions. This allows you to reload the page when you make a change and immediately observe the results.

./js/util.js: Basic setup code (e.g. GUI), so your HTML is less cluttered.

The main entry points into the framework are the following files:

./helloworld.html: a simple WebGL test setup.

./meshrenderer.html: starting point for assignments 1 to 3.

./raytracer.html: starting point for assignment 4.

Prerequisites

You will need a GPU and a browser with WebGL 2 support. All up-to-date modern browsers (Chrome, Firefox, Edge, Safari) should be fine. As for GPUs, most modern laptops with integrated graphics will work, but you might need to update your graphics drivers.

WebGL 2 support can be verified at https://get.webgl.org/webgl2/. If something does not work, that page will also refer you to the support page for your browser.

Development Setup

The HTML file can be run locally and does not require a web server, so in principle you can edit it with any text editor. However, we recommend using Visual Studio Code (https://code.visualstudio.com/), which is available on all major platforms. This is also the environment that will be used in the exercise sessions. You are of course free to use a custom setup, but this will make it more difficult for us to help you with troubleshooting.

The following extensions for VSCode are also helpful and therefore recommended:

Live Preview: This will give you a live preview window that will automatically reload when you edit the code, providing you with instant feedback (several other extensions also do similar things, but this one will open the preview window as a VSCode tab).

WebGL GLSL Editor: This provides you with syntax highlighting for GLSL shader files (while there are many other extensions with similar functionality, this one works best for our setup with inline shaders in the HTML file).

You can install them by opening the extensions tab in the activity bar and searching for the extension name exactly as given above.

Basic Functionality

Out of the box, our framework (in meshrenderer.html) already provides us with everything needed to render (although quite primitively) common models in the Wavefront OBJ file format (https://en.wikipedia.org/wiki/Wavefront_.obj_file). We also provide you with a set of test models (cg-models.zip) that you can use during your exercises, but many (although certainly not all) files that you can find on the web should work in principle.

The load button in the “General” tab of the “Parameters” panel will allow you to load geometry (OBJ files) as well as their associated materials (MTL files) and textures (image files, typically JPG or PNG). After loading, the renderer will then display the unshaded model.

You can control the camera with mouse, touch, and keyboard controls. It can be reset to its default settings using the “Reset View” button in the “General” tab.

Mouse Controls:

Rotate: hold left button

Zoom: scroll wheel or hold right button (move up = zoom in, move down = zoom out)

Pan: hold middle button

Touch Controls:

Rotate: one-finger touch & drag

Zoom: two-finger pinch

Pan: two-finger touch & drag

Keyboard Controls:

Cursor keys: rotate in 30 degree steps

Page Up/Page Down: zoom in/out by 20%

Guidelines

●   Document all the assignments that you solved using the supplied README.md file (documentation can be in German or English). Focus on which assignments you solved and give usage instructions (e.g., how your GUI elements operate).

●   Make sure you document any extra features that were not required. Even if they are not listed in the assignments, you may get extra points for them. Note: the total number of points cannot exceed 100. If you are unsure whether something qualifies for extra points, just ask us.

●    Especially if this is your first time doing any graphics programming, things might be a bit overwhelming at first. In the exercises (and of course the lecture), you will receive a detailed explanation of how things work and how you can approach the exercises. However, don’t be afraid to experiment – you can always revert to the default code.

●   WebGL 2 Fundamentals (https://webgl2fundamentals.org/) is a great introduction to WebGL programming. In fact, its author also wrote TWGL, the helper library we use in our project, and our OBJ model loader is just a small extension of theirs, so the articles there will probably answer many of your questions.

●   Under normal circumstances, you should never need to modify any of the files in the ./js/ folder. All required functionality can be implemented in the HTML file. For instance, the source code already contains an example of how to add custom user interface components and input bindings.

●    Despite this, we strongly encourage you to read all the code that we supply (except for the 3rd party libraries TWGL and Tweakpane), as it will help you to understand how everything works. If something is unclear, just ask us.

●   If you think that it is not possible to implement the envisioned functionality in the HTML file, you are allowed to modify the existing files, although heavy changes are discouraged. It is still a good idea to ask us for feedback. In any case, please make sure to include all modified files in the final submission.

●   This goes without saying, but: DO NOT PLAGIARIZE CODE! We know all the sources  (in fact, we point you to many of them), so copy/paste will be detected. This does not mean that you cannot learn from tutorials and examples, but any submitted code should be yours.

Assignments (100 points)

There is only a single deadline for handing in all assignments. Late submissions will not be accepted. The deadline is:

January 31, 2024, 23:59

The submission will be handled with Stud.IP. Further details on the process will be announced during the semester.

Assignment 1: Illumination (10 points)

Using meshrenderer.html as  a  starting  point,  perform  the  following tasks and document them in README.md.

(a) Implement illumination using the Phong illumination model with a point light source by adapting the vertex and fragment shader. The illumination model should be evaluated per fragment (a shader sketch follows the hints below). (5 points)

(b) Provide a user interface for modifying the light source parameters, including its position as well as its ambient, diffuse, and specular contributions. (5 points)

Up to  5 extra points will be awarded for documented additional features, e.g., support for multiple light sources, different light source types, more advanced illumination models, etc.

Hints:

○   Think about in which coordinate space lighting should be performed (multiple options are possible) and perform the corresponding transformations.

○   Note that when a surface is transformed with a matrix M, the normal of the surface is transformed by the matrix transpose(inverse(M)) – this is often referred to as the inverse transpose, even though it is the transpose of the inverse of the matrix (see https://www.scratchapixel.com/lessons/mathematics-physics-for-computer-graphics/geometry/transforming-normals.html).
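To make these hints concrete, here is a minimal GLSL ES 3.00 sketch of per-fragment Phong shading with a single point light, evaluated in eye space (one of the possible coordinate-space choices). All attribute, uniform, and varying names are placeholders and will not match the framework's shaders exactly:

#version 300 es
// Vertex shader (placeholder names, not the framework's).
in vec3 a_position;
in vec3 a_normal;

uniform mat4 u_modelView;     // model-view matrix
uniform mat4 u_projection;    // projection matrix
uniform mat4 u_normalMatrix;  // transpose(inverse(modelView)), see the hint above

out vec3 v_positionEye;       // fragment position in eye space
out vec3 v_normalEye;         // normal in eye space

void main() {
  vec4 posEye   = u_modelView * vec4(a_position, 1.0);
  v_positionEye = posEye.xyz;
  v_normalEye   = mat3(u_normalMatrix) * a_normal;
  gl_Position   = u_projection * posEye;
}

#version 300 es
precision highp float;
// Fragment shader: Phong evaluated per fragment in eye space.
in vec3 v_positionEye;
in vec3 v_normalEye;

uniform vec3 u_lightPositionEye;  // point light position, already in eye space
uniform vec3 u_lightAmbient;
uniform vec3 u_lightDiffuse;
uniform vec3 u_lightSpecular;
uniform vec3 u_materialDiffuse;
uniform vec3 u_materialSpecular;
uniform float u_shininess;

out vec4 fragColor;

void main() {
  vec3 N = normalize(v_normalEye);
  vec3 L = normalize(u_lightPositionEye - v_positionEye);
  vec3 V = normalize(-v_positionEye);  // the camera sits at the origin in eye space
  vec3 R = reflect(-L, N);

  vec3 ambient  = u_lightAmbient  * u_materialDiffuse;
  vec3 diffuse  = u_lightDiffuse  * u_materialDiffuse  * max(dot(N, L), 0.0);
  vec3 specular = u_lightSpecular * u_materialSpecular * pow(max(dot(R, V), 0.0), u_shininess);

  fragColor = vec4(ambient + diffuse + specular, 1.0);
}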

Assignment 2: Texturing (30 points)

Using your implementation in meshrenderer.html from assignment 1 as a starting point, perform the following tasks and document them in README.md.

(a) Implement texture mapping. Use the model’s textures (ambientMap, diffuseMap, specularMap) as specified in the material properties. (5 points)

(b) Implement tangent-space normal mapping using the bumpMap material property to add extra surface details. (10 points)

(c) Implement procedural bump mapping using a suitable function. Provide GUI elements to enable/disable it and to control the parameters of the bump function. (15 points)

Up to 10 extra points will be awarded for documented additional features, e.g., environment mapping, displacement mapping, etc.

Hints:

○   Our OBJ loader already computes the tangents needed for normal mapping in assignment 2b; the bitangent can easily be obtained from the normal and tangent (a sketch of the tangent-frame construction follows these hints).

○   In Wavefront MTL files, there is no separate field for normal maps, and most models simply use map_Bump, so you can assume that this is a normal map. The “windmill”, “spaceship”, and “ogre” models, for example, have normal maps.

○   If you come across a model that only has an actual bump map (only a height component), you can also use a converter like this one to generate a normal map: https://cpetry.github.io/NormalMap-Online/

○   For assignment 2c, you may be able to compute the partial derivatives analytically if you choose an appropriate function. Otherwise, you can either use finite differences or GLSL’s built-in derivative functions (a sketch follows these hints): https://registry.khronos.org/OpenGL-Refpages/gl4/html/dFdx.xhtml
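As a starting point for task 2b, here is a minimal fragment-shader sketch of tangent-space normal mapping. It assumes the interpolated normal and tangent computed by the OBJ loader are passed in as varyings; all names are placeholders:

// Fragment shader excerpt (placeholder names, not the framework's).
in vec3 v_normalEye;    // interpolated surface normal (eye space)
in vec3 v_tangentEye;   // interpolated tangent from the OBJ loader (eye space)
in vec2 v_texcoord;

uniform sampler2D u_bumpMap;  // map_Bump, interpreted as a normal map

vec3 mappedNormal() {
  vec3 N = normalize(v_normalEye);
  vec3 T = normalize(v_tangentEye - N * dot(N, v_tangentEye));  // re-orthogonalize
  vec3 B = cross(N, T);                                         // bitangent from N and T
  mat3 TBN = mat3(T, B, N);                                     // tangent space -> eye space
  vec3 n = texture(u_bumpMap, v_texcoord).rgb * 2.0 - 1.0;      // unpack [0,1] to [-1,1]
  return normalize(TBN * n);
}

The returned normal can then replace the interpolated normal in the Phong computation from assignment 1.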
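For task 2c, one possible sketch uses a sine-based height function whose partial derivatives can be computed analytically; the shading normal is then perturbed along the tangent frame. The height function and parameter names are only examples, and the same structure works with dFdx/dFdy or finite differences instead of the analytic derivatives:

// Fragment shader excerpt (example height function and parameters).
uniform float u_bumpFrequency;  // controls the size of the bumps
uniform float u_bumpStrength;   // controls how pronounced the bumps are

// Height h(u,v) = sin(f*u) * sin(f*v); its partial derivatives are analytic.
vec3 proceduralNormal(vec3 N, vec3 T, vec3 B, vec2 uv) {
  float f = u_bumpFrequency;
  float dhdu = f * cos(f * uv.x) * sin(f * uv.y);
  float dhdv = f * sin(f * uv.x) * cos(f * uv.y);
  // Perturb the surface normal along the tangent frame and renormalize.
  return normalize(N - u_bumpStrength * (dhdu * T + dhdv * B));
}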

Assignment 3: Animation (30 points)

Using your implementation in meshrenderer.html from assignments 1 and 2 as a starting point, perform the following tasks and document them in README.md.

(a)  Implement  a  mechanism  for  animating  the  camera  path  using  a  spline  curve.  Provide  a suitable user interface for adding/removing keyframes based on the current camera matrix. When the playback is started, the camera should move along an interpolated path defined by these keyframes and, when stopped, return to its previous position. (20 points)

(b) Animate other meaningful parameters, e.g., colors or light sources, using the same mechanism. (10 points)

Up to 10 extra points will be awarded for documented additional features, e.g., animation of individual objects in the mesh, interactive picking and movements of objects, etc.

Hints:

○   You will need to decompose the view matrix into its translation, rotation, and scale components, interpolate these separately, and then compose the new interpolated view matrix.

○   glMatrix provides the functions mat4.getTranslation, mat4.getRotation, and mat4.getScaling for extraction, and mat4.fromRotationTranslationScale to put the matrix back together (see the sketch after these hints).

○   The easiest way to correctly interpolate rotations is using quaternions (this is what mat4.getRotation extracts), using spherical linear interpolation (slerp).

○   You may choose any type of spline, but Catmull-Rom splines (https://en.wikipedia.org/wiki/Centripetal_Catmull%E2%80%93Rom_spline) are a good choice for camera interpolation.

○   As the interpolation result from a spline is just a combination of weighted terms, it can be done using a series of linear interpolations/spherical linear interpolations. You just need to choose the correct weighting terms.
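Following these hints, here is a minimal JavaScript sketch (using glMatrix as bundled with the framework) that blends two stored view matrices. A Catmull-Rom evaluation would combine four keyframes with the corresponding basis weights instead of the single lerp/slerp between two keyframes shown here; the function and variable names are only examples:

// Decompose two stored view matrices and blend them with parameter t in [0, 1].
const { mat4, vec3, quat } = glMatrix;  // if not already in scope

function interpolateView(viewA, viewB, t) {
  const tA = vec3.create(), tB = vec3.create();
  const sA = vec3.create(), sB = vec3.create();
  const rA = quat.create(), rB = quat.create();

  mat4.getTranslation(tA, viewA);  mat4.getTranslation(tB, viewB);
  mat4.getScaling(sA, viewA);      mat4.getScaling(sB, viewB);
  mat4.getRotation(rA, viewA);     mat4.getRotation(rB, viewB);

  const tOut = vec3.create(), sOut = vec3.create(), rOut = quat.create();
  vec3.lerp(tOut, tA, tB, t);   // translation: linear interpolation
  vec3.lerp(sOut, sA, sB, t);   // scale: linear interpolation
  quat.slerp(rOut, rA, rB, t);  // rotation: spherical linear interpolation
  quat.normalize(rOut, rOut);

  const view = mat4.create();
  mat4.fromRotationTranslationScale(view, rOut, tOut, sOut);
  return view;
}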

Assignment 4: Ray Tracing (30 points)

Using raytracer.html as a starting point, perform the following tasks and document them in README.md.

(a) Implement ray tracing of an analytically defined scene consisting of three spheres and a ground plane. The ray tracer should support shadows, reflection, and refraction, and the material properties of the spheres should be set to demonstrate that these effects work. (25 points)

(b) Provide user interface controls to change the position and material properties of the objects in the scene. (5 points)

Up to 10 extra points will be awarded for documented additional features, e.g., additional object types (box, cone, torus, …), soft shadows, blurry reflections, etc.

Hints:

○   The entire ray tracing algorithm can be implemented in the fragment shader. The raytracer.html file already contains code for setting up the rays as well as a function for performing the ray-sphere intersection.

○   GLSL shaders do not allow recursion, but this is also not needed to perform ray tracing. Simply use a loop with a maximum number of bounces (i.e., ray-object interactions); see the sketch after these hints.

○   You can implement the renderer as a classical ray tracer as discussed, but it is equally possible to implement it using ray marching of distance functions. Make sure to document what you have implemented.
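To illustrate the loop-based structure mentioned in the hints, here is a minimal GLSL sketch that accumulates reflection bounces iteratively. The Hit struct and the functions intersectScene, shade, and backgroundColor are placeholders; the actual ray setup and intersection code in raytracer.html will differ:

// Fragment shader excerpt (all names are placeholders).
struct Hit {
  bool  valid;         // was anything hit?
  vec3  position;      // hit point
  vec3  normal;        // surface normal at the hit point
  float reflectivity;  // how mirror-like the material is
};

const int MAX_BOUNCES = 4;

vec3 traceRay(vec3 origin, vec3 dir) {
  vec3 color = vec3(0.0);
  float weight = 1.0;  // how much the current bounce still contributes

  for (int i = 0; i < MAX_BOUNCES; ++i) {
    Hit hit = intersectScene(origin, dir);  // closest sphere/plane intersection
    if (!hit.valid) {
      color += weight * backgroundColor(dir);
      break;
    }
    // Local shading (e.g. Phong plus a shadow ray) at the hit point,
    // blended against the reflected contribution.
    color += weight * (1.0 - hit.reflectivity) * shade(hit, dir);

    // Continue with the reflected ray, attenuated by the material's reflectivity.
    weight *= hit.reflectivity;
    origin = hit.position + 1e-3 * hit.normal;  // offset to avoid self-intersection
    dir = reflect(dir, hit.normal);
  }
  return color;
}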