Since there is an almost unlimited number of corner cases out there, what will the grading policy be in terms of the program's correctness and robustness?
Here are a few cases I can think of at the moment. Assume all links are good and there are no obvious syntax errors in the HTML source file, such as missing tags or the like:
1. If the official solution returns 10 links for a particular query, will correctness be based on whether all 10 links are returned? And must they appear in the same ranking?
2. If the delta function and action table are modified but still can't handle every possible case, or even throw exceptions on uncommon ones, will those exceptions cost points? Or will full credit for correctness be given as long as, after WebCrawler is called (possibly throwing exceptions on uncommon inputs), WebSearch returns a reasonable number of links in the right order according to their indegrees? (A sketch of what I mean by indegree ordering follows this list.)
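For concreteness, here is a minimal sketch of the ordering I have in mind, using a hypothetical indegree map rather than the assignment's actual data structures (`rankByIndegree` and the map are my own names, not part of the provided API):

```java
import java.util.Comparator;
import java.util.List;
import java.util.Map;

public class IndegreeOrderingSketch {
    // Hypothetical helper: given a map from URL to indegree,
    // return the URLs sorted so higher-indegree pages come first.
    static List<String> rankByIndegree(Map<String, Integer> indegree) {
        return indegree.entrySet().stream()
                .sorted(Map.Entry.<String, Integer>comparingByValue(Comparator.reverseOrder()))
                .map(Map.Entry::getKey)
                .toList();
    }

    public static void main(String[] args) {
        // Even if the crawl threw on some uncommon page, the search
        // step could still rank whatever links were collected.
        Map<String, Integer> indegree = Map.of(
                "http://a.example", 3,
                "http://b.example", 7,
                "http://c.example", 1);
        System.out.println(rankByIndegree(indegree));
        // prints: [http://b.example, http://a.example, http://c.example]
    }
}
```

My question is whether output matching this kind of ordering would earn full correctness credit even if the crawl itself was not exception-free.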