
Submission instructions

This page explains how to submit your code and Doxygen code documentation, how to check your submitted files, and how to solve common submission problems.

Remember that only the last submission made by any member of each team is considered for marking.

Test submit assignment 2

The testsubmitece297s script tests whether all required files are included in your submission, whether the project builds, and whether the available test cases pass. You can test-submit assignment 2 as follows.

> /cad2/ece297s/public/testsubmitece297s 2

This script does not submit your code. You may use it as a check before you are ready to submit your code. Once you get the positive confirmation from the testsubmitece297s script (Your code is ready for submission), follow the instructions below to submit your code.

We highly recommend that you continue to use the testsubmitece297s script throughout the assignment rather than leaving it until the last minute before the deadline. The script is meant to help you understand which tests pass and what you may want to fix before submitting. Note that we will run additional test cases on your submission beyond those available to you via the submission script. Even if the script tells you to go ahead with your submission, your code may still exhibit bugs and fail some of the test cases we use for marking.

Save the output of the testsubmitece297s script as follows.

> /cad2/ece297s/public/testsubmitece297s 2 > testsubmitAss2.log

Please make sure that you include the output file testsubmitAss2.log in storage-asst2.tgz, as you are required to submit it as part of your code.
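A quick way to confirm the log actually made it into the archive is to list the tarball's contents. This is only a sketch, assuming the tarball layout produced by the submission steps on this page:

```shell
cd ~/ece297/submit
# The log should appear in the archive listing.
tar -tf storage-asst2.tgz | grep testsubmitAss2.log
# If grep prints nothing, copy testsubmitAss2.log into the storage-asst2
# directory and re-create the tarball before submitting.
```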

Code Submission

Make sure the FINAL versions of all your changes are added and committed to the Subversion repository. Don't forget to svn add and svn commit any new files (other than binary files) you created. Please ensure LOGGING = 0 for both client and server.
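A quick way to double-check the LOGGING setting before you commit is to grep for it. The file names below are assumptions; point grep at wherever LOGGING is actually defined in your tree:

```shell
cd ~/ece297/storage
# File names are assumptions; adjust them to your source layout.
grep -n "LOGGING" client/storage.c server/server.c
# Every match should show LOGGING set to 0 before you commit and submit.
```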

You will be submitting three files: storage-asst2.tgz will contain your complete source code, storage-asst2.diff will contain a record of the changes you've made to the code since the initial checkin, and doxygen-asst2.tgz will contain your Doxygen source code documentation. (To generate the .diff file, you will use the svn diff command. The .diff file lets us easily see what changes you've made to the skeleton code.)

  • Create and submit storage-asst2.tgz as follows.
> mkdir ~/ece297/submit
> svn export ~/ece297/storage ~/ece297/submit/storage-asst2
> cd ~/ece297/submit
> tar zcf storage-asst2.tgz storage-asst2
> rm -r storage-asst2
> submitece297s 2 storage-asst2.tgz
  • Create and submit storage-asst2.diff as follows.
> mkdir ~/ece297/submit
> cd ~/ece297/storage
> svn diff -r 1:HEAD > ~/ece297/submit/storage-asst2.diff
> cd ~/ece297/submit
> submitece297s 2 storage-asst2.diff
  • Create and submit doxygen-asst2.tgz as follows.
> mkdir ~/ece297/submit
> cd ~/ece297/storage/doc
> make
> tar zcf ~/ece297/submit/doxygen-asst2.tgz doxygen/
> cd ~/ece297/submit
> submitece297s 2 doxygen-asst2.tgz
  • Submit performance-asst2.pdf as follows (assuming the PDF file is in the ~/ece297/submit directory).
> cd ~/ece297/submit
> submitece297s 2 performance-asst2.pdf

Please ensure your census data is available at storage/data/census/ and storage/data/census.conf exists and is part of the submitted tar archive.
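Before submitting, you can confirm that both census paths made it into the archive by listing the tarball's contents. A sketch, assuming the tarball created in the steps above:

```shell
cd ~/ece297/submit
# Both paths should appear in the listing.
tar -tf storage-asst2.tgz | grep "data/census"
# Expect entries for storage-asst2/data/census/ and storage-asst2/data/census.conf;
# if either is missing, svn add the files, commit, re-export, and re-create the tarball.
```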

You can see what files you have submitted by entering:

> submitece297s -l 2

In particular, check the file size and date to confirm the correct file was submitted.

Some of the marking tests are available at /cad2/ece297s/public/assignment2/a2-partial.tgz. You can try running them as follows.

> cd ~/ece297/storage/test
> tar zxf /cad2/ece297s/public/assignment2/a2-partial.tgz
> cd a2-partial
> make clean run

We highly recommend you make sure your submitted code passes the test suites given to you as part of the skeleton distribution and that it passes the testsubmitece297s test before submission.

These test suites were also mentioned in the instructions for setting up the development environment. Follow the instructions below to unpack the code in a temporary directory and test it. (You can delete the temporary directory when you're done.)

> mkdir ~/ece297/tmp
> cd ~/ece297/tmp
> tar zxf ~/ece297/submit/storage-asst2.tgz
> cd ~/ece297/tmp/storage-asst2/test
> make run

We will mark your submissions partially based on the test cases given to you, so if these test cases fail, our processing of your submission is likely going to fail as well. Note that we will add additional test cases to evaluate your submission above and beyond the ones we are making available to you.

Check Submitted Files

We only mark the most recent files submitted by any member of each team. You can check these files by entering:

> checksubmit 2

In particular, check each file's size and date, and the user who submitted it.

Common Submission Problems

SVN-Related Submission Problems

Try to avoid SVN-related submission problems as follows:

  • SVN commit: Before you export your code using svn export, you need to commit your code first. Failing to do this may result in submission of an earlier, possibly incomplete, version of your code (remember that your new files must first be added via svn add and then committed).
  • SVN export: If svn export complains that the destination directory already exists, either use the --force option to overwrite the existing files in the destination directory, or first delete the destination directory using rm -r before exporting your code from SVN. (Remember that either way, the files in the old destination directory are lost.)
  • SVN export does not work: If you run into svn export problems at the last minute, simply copy your code directory to a separate directory, e.g., storage-asstX (replace X with the assignment number), and tar the directory:
 > tar -zcf storage-asstX.tgz storage-asstX/

You can then submit the tar file by following instructions in the assignments' submission pages. Remember that by following the above approach, you will lose the marks allocated for successful SVN export.

Other Common Submission Problems

Some other common submission problems happen when you include the wrong files in your submission tar file.

  • Config file parsing: Remember that your server must read its startup parameters and their values from its config file. Hard-coding these values in your program is very likely to cause major problems; always read them from the config file.
  • Submitting wrong files: Before submitting your code, make sure that your tar file contains the correct source files. You can check this by using the command below, which shows a list of the archived files.
> tar -tf <submission-file>.tgz
  • Examining the files: To make 100% sure that the submitted files contain your latest code, un-tar your files in a separate directory and examine critical files (e.g., server.c and storage.c) individually. Open the files in an editor to confirm they indeed contain the latest version of your code. You can un-tar the files using the following command. (Run it in a separate directory that you create for this purpose; do NOT do this in your main source directory.)
 > tar -zxf storage-asst2.tgz
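Beyond spot-checking files in an editor, one way to compare the unpacked submission against your working copy is a recursive diff. This is a sketch using the directory names from elsewhere on this page; note that .svn administrative entries in your working copy will show up as extra noise:

```shell
# Run from the separate directory where you un-tarred the submission.
tar -zxf storage-asst2.tgz
diff -r storage-asst2 ~/ece297/storage
# No output (apart from "Only in ..." lines for .svn entries) means the
# archive matches your working copy.
```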

Marking Guidelines

Your assignment will be marked based on the code you submit and based on the design document you submit.

The code you submit for this assignment will be evaluated as follows. We will build your system and test it for correctness based on a set of test cases. For example, we will issue get/set calls and check for the return values and error conditions.

It is therefore critically important that you adhere to the Makefile template and code template we provide for you, as our build scripts will be based on these files. At the very least make sure the test suites given to you pass.

Marks will be allocated to a flawless build of your system. Should your build fail, we will make a reasonable effort to fix the problem, but will deduct some marks. Should our intervention not lead to a successful build, we will inspect your submission and allocate marks based on an assessment of the development effort, the documentation, and the clarity and cleanliness of your code. However, marks will be deducted to differentiate your submission from those that build correctly and pass test cases.

It is extremely important that you run the testsubmitece297s script before actually submitting your code so that you can make sure that all files are included, the project is buildable, and the available test cases pass successfully. The test cases available via the testsubmitece297s script account for 50% of the actual test cases used for marking your code.

Marks will also be allocated to the number of test cases your system correctly passes.

A perfect score for your code submission can be achieved if your system builds correctly and passes all test cases.

The design document you submit for this assignment will be evaluated on the basis of the engineering communication principles laid out in the course text book and explained in lectures. That is, we will not be grading on the basis of qualitative judgments of what makes good or bad writing, but rather how well you are able to implement awareness of: purpose and audience, overall organization, rhetorical tools, paragraph and sentence level clarity, as well as visual elements. The grading rubric is available here. The rubric defines qualities that exceed requirements, meet requirements or require revision to meet requirements.

In addition, Communication Instructors/Project Managers will offer individual comments that guide students toward revising and editing the M2 Design Document, which will form the basis of the second design document. That document, and subsequent ones, will thus retain previous iterations, revised and improved, while adding new sections.

A perfect score is possible if your document is thoughtful, technically accurate, economically and clearly written, supports every statement meaningfully with either logic or data (your own or from research sources), has absolutely no padding of any sort, and contains minimal to no errors in grammar, syntax or usage.

Tentative Code Marking Scheme

Here is a rough guideline for the marking scheme of the coding portion of this assignment:

  • [2 marks] Build: 2 marks if the server and client library build without error.
  • [2 marks] Proper submission: 2 marks are awarded for proper submission of the milestone.
  • [2 marks] Coding attempt: 1 mark each for a reasonable attempt to implement the server and client library.
  • [2 marks] Doxygen: 1 mark for adding your own Doxygen comments, 1 mark for submitting generated Doxygen output.
  • [11 marks] Basic tests: 1 mark for each basic test that passes. Functionality such as starting the server and connecting to it is tested.
  • [17 marks] Error tests: 1 mark for each error test that passes. Functionality such as handling invalid parameters in the client library functions is tested.
  • [9 marks] Implementation of proper client-server authentication, including failed authentication.
  • [12 marks] Integration tests: 1 mark for each integration test that passes. Functionality such as inserting and retrieving records from the server is tested.
  • [5 marks] Performance evaluation report.

The assignment is worth 62 marks in total.

Bonus Marks

In this assignment you may obtain the following bonus marks.

  • There is a bonus mark for early submission of your code. Your submission qualifies as early if you submit 7 full days before the advertised code submission deadline. You may not submit prior to submitting the design document. We want you to think, design, and plan before you code. Also, your submission must build successfully and at least 90% of our test cases must pass.