The safe middle space still does not involve a computer
Lots of my tests involved writing pseudocode, or "just write something that looks like C or Java." Forget the semicolon at the end of a line, or write System.print() rather than System.out.println(), and you might lose a single point. Maybe.
If there were specific functions you needed to call, the test itself would include a man page or similar, or those functions would be the actual topic under test.
I hand-wrote a bunch of SQL queries. Hand-wrote code for my Systems Programming class that involved pointers, and I'm not even good with pointers. I hand-wrote Java for job interviews.
It's pretty rare that you actually need to test whether someone has memorized syntax; not having to memorize it is basically the entire point of modern development environments.
But if you are completely unable to function without one, you might not know as much as you would hope.
The first algorithms came before the first programming languages.
Sure, it means you need to be able to run the code in your head and mentally "debug" it, but that's a feature.
If you could not manage these things, you washed out in the CS101 class that nearly every STEM student took. The remaining students were not brilliant, but most of them could write code to solve problems. Then you got classes that could actually teach and test that problem-solving itself.
The one class where we built larger apps, more akin to actual jobs, could have been done entirely in the lab with locked-down computers if need be. But the professor really didn't care if you wanted to fake the lab work: you still needed to pass the book learning for "Programming Patterns" (which people really struggled with), you still needed to give a demo and presentation, and you still needed to demonstrate that you could read requests from a "customer" and turn them into features, requirements, and UX.
Nobody cares about people sabotaging their own education except in programming, because no matter how much MBAs insist that all workers are replaceable, they cannot figure out a way to actually evaluate the competency of a programmer without knowing programming themselves. If an engineer doesn't actually understand how to evaluate static stresses on a structure, they are going to have a hard time keeping a job. Meanwhile, in the world of programming, hopping jobs once a year is "normal" somehow, so you can make a lot of money while literally not knowing FizzBuzz. I don't think the problem is actually education.
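(For anyone unfamiliar: FizzBuzz is the canonical interview screening exercise — count upward, replacing multiples of 3 with "Fizz", multiples of 5 with "Buzz", and multiples of both with "FizzBuzz". A minimal sketch in Python:)

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```

That's the whole exercise — which is exactly why "can't write FizzBuzz" became shorthand for a candidate who can't program at all.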
Computer Science isn't actually about using a laptop.
Maybe the middle space doesn't involve a compiler, but I really think computers should be allowed on tests, for a different reason: the computer makes it possible to write out of order. You can go back and add to the beginning without erasing and rewriting everything.
This applies to prose as much as code. A computer completely changes the experience of writing, for the better.
Yes, obviously people made do with analog writing for hundreds of years, yadda yadda, I still think it's a stupid restriction.
To a very limited extent, yes. But you'd need a lot of arrows to replicate what can be done on a computer. The computer completely frees you from worrying about space.
In my CS curriculum we learned SQL in theory only. We learned the relational model, normalization, joins, predicates, aggregation, etc., all without ever touching an actual database. On the exams we wrote queries in a paper "blue book", which was graded by teaching assistants.
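The blue-book material translates straight to a machine, though. As a sketch of the kind of query we wrote by hand — a join with a predicate and aggregation — here it is run against an in-memory SQLite database via Python's sqlite3 module (the schema and data are invented for illustration):

```python
import sqlite3

# In-memory database; students/grades schema is made up for this example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE grades (student_id INTEGER REFERENCES students(id),
                         course TEXT, score INTEGER);
    INSERT INTO students VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO grades VALUES (1, 'CS101', 95), (1, 'Databases', 88),
                              (2, 'CS101', 91);
""")

# A join, a predicate, and aggregation -- the blue-book staples.
rows = conn.execute("""
    SELECT s.name, COUNT(*) AS courses, AVG(g.score) AS avg_score
    FROM students s
    JOIN grades g ON g.student_id = s.id
    WHERE g.score >= 80
    GROUP BY s.name
    ORDER BY s.name
""").fetchall()

for name, courses, avg_score in rows:
    print(name, courses, avg_score)
```

Whether it's graded by a TA with a red pen or executed by an engine, the reasoning — which rows join, which rows the predicate keeps, what each group aggregates to — is the same skill.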