Assignment v1.0
Semester 1, 2020
Department of Computing
Curtin University

1 Introduction

In practicals you have implemented and learned about a number of algorithms and ADTs and will be implementing more of these in the remaining practicals. In this assignment, you will be making use of this knowledge to implement a system to explore and compare a variety of ADT implementations. Feel free to re-use the generic ADTs from your practicals. However, remember to self-cite; if you submit work that you have already submitted for a previous assessment (in this unit or any other) you have to specifically state this. Do not use the Java implementations of ADTs – if in doubt, ask.

2 The Problem

This assignment requires the extension of your graph code to apply it to social network analysis. You will be simulating the spread of disease through a population. At each timestep, each connection involving an infected person will have a probability of passing the infection to the person they are in contact with. On the next timestep, the newly infected will have the potential to infect their neighbours, and so on. There will be different categories of connections, which will affect the transmission rate. There will also be different interventions available, reducing or eliminating transmission across some categories of connections.
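As a rough sketch only (not a prescribed design), a single timestep might look something like the following. All class and field names here are placeholders, and the built-in Java collections are used purely to keep the example short; your submission must use your own ADTs rather than the Java ones.

    import java.util.*;

    // Illustrative sketch of one simulation timestep over an adjacency-list
    // view of the network. Names are placeholders, not required structure.
    public class TimestepSketch {
        enum Status { SUSCEPTIBLE, INFECTED, RECOVERED, DECEASED }

        static class Person {
            String name;
            Status status = Status.SUSCEPTIBLE;
            Person(String name) { this.name = name; }
        }

        static class Contact {              // one edge of the network
            Person other;
            double categoryWeight;          // scales the base transmission rate
            Contact(Person other, double weight) {
                this.other = other;
                this.categoryWeight = weight;
            }
        }

        static void runTimestep(Map<Person, List<Contact>> network,
                                double transRate, double recovRate,
                                double deathRate, Random rng) {
            List<Person> newlyInfected = new ArrayList<>();

            for (Person p : network.keySet()) {
                if (p.status != Status.INFECTED) continue;

                // Each contact of an infected person may catch the disease this step,
                // with the chance scaled by the category of the connection.
                for (Contact c : network.get(p)) {
                    if (c.other.status == Status.SUSCEPTIBLE
                            && rng.nextDouble() < transRate * c.categoryWeight) {
                        newlyInfected.add(c.other);
                    }
                }

                // The infected person may recover or die on this timestep.
                double roll = rng.nextDouble();
                if (roll < recovRate) {
                    p.status = Status.RECOVERED;
                } else if (roll < recovRate + deathRate) {
                    p.status = Status.DECEASED;
                }
            }

            // People infected this step only become infectious on the next timestep.
            for (Person p : newlyInfected) {
                p.status = Status.INFECTED;
            }
        }
    }
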
Your program should be called healthSim and have three starting options:
• No command line arguments : provides usage information
• "-i" : interactive testing environment
• "-s" : simulation mode
(usage: healthSim -s netfile trans_rate recov_rate death_rate int_code)
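
As an illustration only, the three starting options might be dispatched along these lines; the method names (runInteractive, runSimulation) are placeholders for whatever structure you choose.

    // Illustrative entry point only; the real class layout is up to you.
    public class HealthSim {
        public static void main(String[] args) {
            if (args.length == 0) {
                System.out.println("Usage:");
                System.out.println("  healthSim -i");
                System.out.println("  healthSim -s netfile trans_rate recov_rate death_rate int_code");
            } else if (args[0].equals("-i")) {
                runInteractive();                   // menu-driven testing environment
            } else if (args[0].equals("-s") && args.length == 6) {
                String netFile   = args[1];
                double transRate = Double.parseDouble(args[2]);
                double recovRate = Double.parseDouble(args[3]);
                double deathRate = Double.parseDouble(args[4]);
                String intCode   = args[5];
                runSimulation(netFile, transRate, recovRate, deathRate, intCode);
            } else {
                System.out.println("Invalid arguments; run with no arguments for usage.");
            }
        }

        private static void runInteractive() { /* interactive menu loop */ }

        private static void runSimulation(String netFile, double transRate,
                                          double recovRate, double deathRate,
                                          String intCode) { /* simulation loop */ }
    }
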
When the program starts in interactive mode, it should show the following main menu:
(1) Load network
(2) Set rates/interventions
(3) Node operations (find, insert, delete)
(4) Edge operations (like/follow - add, remove)
(5) New infection
(6) Display network
(7) Display statistics
(8) Update (run a timestep)
(9) Save network

You can structure the menu/UI differently; just make sure that at least those options are included.
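
For example, one possible shape for the interactive loop is sketched below (a Quit option is added here for convenience; it is not part of the required list):

    import java.util.Scanner;

    // One possible menu loop; wording and layout may differ as long as the
    // required operations are available.
    public class MenuSketch {
        public static void main(String[] args) {
            Scanner sc = new Scanner(System.in);
            boolean running = true;
            while (running) {
                System.out.println("(1) Load network");
                System.out.println("(2) Set rates/interventions");
                System.out.println("(3) Node operations (find, insert, delete)");
                System.out.println("(4) Edge operations (add, remove)");
                System.out.println("(5) New infection");
                System.out.println("(6) Display network");
                System.out.println("(7) Display statistics");
                System.out.println("(8) Update (run a timestep)");
                System.out.println("(9) Save network");
                System.out.println("(0) Quit");
                System.out.print("> ");
                switch (sc.nextLine().trim()) {
                    case "0": running = false; break;
                    case "8": /* run one timestep, then report what changed */ break;
                    default:  System.out.println("(remaining options omitted from this sketch)");
                }
            }
        }
    }
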

When running in simulation mode, you will give the input files and parameters on the command line, then output (append) the simulation state and statistics for each “timestep” to a file (unique filename per run).
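
A minimal sketch of that output step is shown below; the timestamp-based filename scheme and the statistics line format are assumptions, not requirements.

    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.PrintWriter;

    // Sketch of per-run output: a unique filename (built here from the current
    // time, which is just one possible scheme) with one line appended per timestep.
    public class SimLogSketch {
        public static void main(String[] args) throws IOException {
            String fileName = "healthSim_" + System.currentTimeMillis() + ".log";

            for (int step = 1; step <= 3; step++) {      // stand-in for the real simulation loop
                int susceptible = 90, infected = 8, recovered = 2, deceased = 0;  // placeholder counts
                try (PrintWriter out = new PrintWriter(new FileWriter(fileName, true))) {
                    out.printf("timestep %d: S=%d I=%d R=%d D=%d%n",
                               step, susceptible, infected, recovered, deceased);
                }
            }
            System.out.println("Wrote " + fileName);
        }
    }
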

In interactive mode you should be able to display the following statistics/information:
• Overall statistics for the population
• List of names in each health category (e.g. Susceptible, Infected, Recovered)
• A person’s record (e.g. age, health status, date/timestep infected/recovered)
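
As a small illustration of the category listings, the sketch below groups names by health status. The name-to-status map is a stand-in for your real records, and the built-in Java collections are used only for brevity; your submission must use your own ADTs.

    import java.util.*;

    // Groups people by health category so the names in each category can be listed.
    public class StatsSketch {
        public static void main(String[] args) {
            Map<String, String> status = new LinkedHashMap<>();   // name -> health status
            status.put("Alice", "Susceptible");
            status.put("Bob", "Infected");
            status.put("Carol", "Recovered");

            Map<String, List<String>> byCategory = new LinkedHashMap<>();
            for (Map.Entry<String, String> e : status.entrySet()) {
                byCategory.computeIfAbsent(e.getValue(), k -> new ArrayList<>()).add(e.getKey());
            }

            for (Map.Entry<String, List<String>> e : byCategory.entrySet()) {
                System.out.println(e.getKey() + " (" + e.getValue().size() + "): " + e.getValue());
            }
        }
    }
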

Once you have a working program, you will investigate the impact of the various parameters and interventions. This investigation will be written up as The Report.

Remember: think before you code!

3 Submission

Submit electronically via Blackboard.

You should submit a single file, which should be zipped (.zip) or tarred (.tar.gz). Check that you can decompress it on the lab computers. These are also the computers on which your work will be tested, so make sure that your work runs there. The file must be named DSA_Assignment_<id>, where <id> is replaced by your student ID. There should be no spaces in the file name; use underscores as shown.

The file must contain the following:
• Your code. This means all .java/.py files needed to run your program. Do include code provided to you as part of the assignment if it is required to run your program.
• A README file including short descriptions of all files and dependencies, and information on how to run the program.
• Your unit test harnesses. One of the easiest ways for us to be sure that your code works is to make sure that you’ve tested it properly. Make it easy for us to test your work: the test harness for class X should be called UnitTestX, or can be included in the class file for Python (a minimal example harness is sketched below).
• Documentation and Report for your code, as described in Section 3.1.
• A signed and dated cover sheet. These are available from Blackboard with the assignment specification. You can sign a hard copy and scan it in, or you can fill in a soft copy and digitally sign it.
Java Students:
Do not include .class files or anything else that we do not need. We will recompile .java files to ensure that what we’re testing is what we’re reading. We will use javac *.java to compile your files and run the unit tests by their expected names.
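As an example of the expected harness shape (illustrative only; the Network class named here is hypothetical, and a framework such as JUnit is equally acceptable):

    // Minimal hand-rolled test harness following the UnitTestX naming convention.
    public class UnitTestNetwork {
        private static int passed = 0, failed = 0;

        public static void main(String[] args) {
            check("new network starts empty", newNetworkStartsEmpty());
            System.out.println(passed + " passed, " + failed + " failed");
        }

        private static boolean newNetworkStartsEmpty() {
            // Network is a placeholder for one of your own classes, e.g.:
            // return new Network().getNodeCount() == 0;
            return true;  // stand-in so this sketch compiles on its own
        }

        private static void check(String name, boolean ok) {
            if (ok) { passed++; System.out.println("PASS: " + name); }
            else    { failed++; System.out.println("FAIL: " + name); }
        }
    }
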
Make sure that your file contains what is required. Anything not included in your submission will not be marked, even if you attempt to provide it later. It is your responsibility to make sure that your submission is complete and correct.

3.1 Documentation and Report

You need to submit the documentation and report in docx or pdf format.
Your Documentation should be a minimum of 3-5 pages (excluding UML and Javadocs) and should include the following:
• Overview: Explains anything the marker may question. This is supplemented by the comments in your code. In general, if the marker is looking at your code and isn’t sure why you have done something in a particular way, they will check your documentation to see if they can find an explanation.
• Traceability Matrix: Maps the requirements in the specification to the implementation and tests, indicating whether they have been done and where to find more information.
• UML Class Diagram: A UML class diagram of your code.
• Class Descriptions: A description of each class you have. You need to let us know not only what the purpose of that class is, but why you chose to create it. As part of this, also identify and justify any places where it may have been useful to create a new class but you chose not to, especially when it comes to inheritance.
• Justification: Justification of all major decisions made. In particular, when you choose an ADT, underlying data structure or algorithm, you need to justify why you chose that one and not one of the alternatives. These decisions are of extreme importance in this assignment.
The Report will follow the structure of a standard academic report or paper and should be at least 3-5 pages long. It discusses the outcomes and application of your code, not how you’ve implemented it (that is the Documentation, above). Required sections are:
• Abstract: Explain the purpose of the report, state what you are investigating, and give the outcomes/recommendations.
• Background: Discuss your approach to developing the simulation code, and the aspects of the simulation that you will be investigating.
• Methodology: Describe how you have chosen to investigate and compare multiple runs of the simulation, and why. You should give some prediction of the expected “performance” of the aspects of your code you are investigating; this includes time complexity and/or memory usage. Include the commands, input files, outputs – anything needed to reproduce your results.
• Results: Present the results of your simulations, and what you discovered.
• Conclusion/Future Work: Give your conclusions and what further investigations could follow. Do the actual results match your prediction?
• References: Chicago referencing style.

3.2 Marking

Marks will be awarded to your submission as follows:
• Code demonstration [30 marks]: We will have a number of tests to run, and you get marks proportional to how many tests your code passes. If your code passes all of the tests, then you will get all 30 marks.
• Documentation [20 marks]: This is mainly about choosing and justifying your structure, ADTs and algorithms. Note that you will only get these marks if you adequately justify your decisions and properly implement them. So, for example, if you choose an algorithm and adequately justify it but aren’t actually able to implement it, you’ll only get part of the marks.
• Report [20 marks]: As described in Section 3.1, with the majority of marks for the Methodology and Results sections.
• Implementation [30 marks]: We will use unit testing as well as looking at code quality; your test harnesses will have a big impact on these marks. Students are encouraged to use testing frameworks for their harnesses, but this does not earn extra marks.
• Bonus marks [5 marks each]: Bonus marks will be available for exceptional/additional features. Examples of enhancements include: age, location, testing rates, the potential for individuals not to follow guidelines, contact tracing, etc.
Marks will be deducted for not following specifications outlined in this document, which includes incorrect submission format and content and using built-in Java ADTs.
If the cover sheet isn’t provided with your submission, your submission will not be marked and you will be awarded zero (0) marks. If you forget to submit the cover sheet you will be allowed to submit it separately to the unit coordinator but will lose 5 marks.

3.3 Requirements for passing the unit

Students must submit an assignment worthy of scoring at least 15% in order to pass the unit.

This assignment has many correct solutions, so plagiarism will be easy for us to detect (and we will).

For information about plagiarism, please refer to http://academicintegrity.curtin.edu.au.

In case of doubt, you may be asked to explain your code, and the reasons for the choices you made while coding, to the unit coordinator. A failure to adequately demonstrate the knowledge required to have produced the code will most likely result in your being formally accused of cheating.

Finally, be sure to secure your code. If someone else gets access to your code for any reason (including because you left it on a lab machine, lost a USB drive containing the code or put it on a public repository) you will be held partially responsible for any plagiarism that results.

3.4 Late Submission

Late submissions will incur a 10% deduction per day.

3.5 Clarifications and Amendments

This assignment specification may be clarified and/or amended at any time. Such clarifications and amendments will be announced via Blackboard. These clarifications and amendments form part of the assignment specification and may include things that affect mark allocations or specific tasks.