The BOSS online submission and assessment system
Joy, Mike, Griffiths, Nathan and Boyatt, Russell (2005) The BOSS online submission and assessment system. Journal on Educational Resources in Computing, 5 (3), Article 2. ISSN 1531-4278.
Official URL: http://dx.doi.org/10.1145/1163405.1163407
Computer programming lends itself to automated assessment. With appropriate software tools, program correctness can be measured, along with an indication of quality according to a set of metrics. Furthermore, the regularity of program code allows plagiarism detection to be an integral part of the tools that support assessment. In this paper, we describe a submission and assessment system, called BOSS, that supports coursework assessment through collecting submissions, performing automatic tests for correctness and quality, checking for plagiarism, and providing an interface for marking and delivering feedback. We describe how automated assessment is incorporated into BOSS such that it supports, rather than constrains, assessment. The pedagogic and administrative issues that are affected by the assessment process are also discussed.
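The automatic correctness testing described above can be illustrated with a minimal sketch: run a submission against a set of instructor-defined test cases and report how many pass. This is an illustrative example only, not BOSS's actual implementation (BOSS is a Java system that uses JUnit-based tests); all names here are hypothetical.

```python
# Illustrative sketch of output-based automated assessment:
# run a callable student submission against (inputs, expected) pairs
# and count how many tests pass. Not BOSS's real code.

def grade_submission(solve, test_cases):
    """Return (passed, total) for a student-supplied callable."""
    passed = 0
    for args, expected in test_cases:
        try:
            if solve(*args) == expected:
                passed += 1
        except Exception:
            pass  # a crashing submission fails that test case
    return passed, len(test_cases)

# Example: grading a hypothetical 'add two integers' exercise.
student_answer = lambda a, b: a + b
cases = [((1, 2), 3), ((0, 0), 0), ((-1, 1), 0)]
print(grade_submission(student_answer, cases))  # (3, 3)
```

Catching exceptions rather than letting them propagate mirrors the robustness a real marking harness needs: one faulty submission must not abort the batch run.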
Item Type: Journal Article
Subjects: Q Science > QA Mathematics > QA76 Electronic computers. Computer science. Computer software
L Education > LB Theory and practice of education > LB2300 Higher Education
Divisions: Faculty of Science > Computer Science
Journal or Publication Title: Journal on Educational Resources in Computing
Publisher: Association for Computing Machinery, Inc.
Page Range: Article 2