Hi All,
Might as well outline some thoughts on the grading criteria for the project. These are indicative only and are not exhaustive. However, they give a fairly good idea of what you can expect. Please ensure your deliverable doesn't lack substance in these broad areas.
1. Quality of the D.P.(s) - How well it aligns with and addresses the business problem vaguely outlined in the project scope document; how well it can be resolved given the data at hand. Etc.
2. Quality of the R.O.s - How well defined and specific the R.O.s are in general; how well they cover and address the D.P.(s); how well they map onto specific analysis tools; how well they lead to the specific recommendations made to the client in the end. Etc.
3. Quality and rigor of data cleaning - The thinking that went into the data cleaning exercise; the logic behind the way you went about it; the steps taken to minimize throwing out useful observations (using imputation, for instance - see the sketch after this list); and the final size of the clean dataset you ended up with for in-depth analysis. Ideally, the data section should contain these details.
4. Clarity, focus and purpose in the Methodology - This flows from the D.P. and the R.O.s: why you chose this particular series of analysis steps and not some alternative. The methodology section is essentially a subset of a full-fledged research design. The emphasis should be on simplicity, brevity and logical flow.
5. Quality of Assumptions made - Assumptions should be reasonable and clearly stated at each step. Was there opportunity to validate any assumptions downstream? Were any reality checks done to see if things hold up?
6. Quality of results obtained - The actual analysis performed and the results obtained. What problems were encountered, and how did you work around them? How useful are the results? If they're not very useful, how did you transform them post-analysis into something more relevant and usable?
7. Quality of insights obtained and recommendations made - How everything you did so far is finally integrated into a coherent whole to yield data-backed recommendations that are clear, actionable, specific to the problem at hand and likely to significantly impact decisions downstream. How well the original D.P. is now 'resolved'.
8. Quality of learnings noted - Post facto, what generic learnings and takeaways emerged from the project? More specifically, "what would you do differently in questionnaire design, in data collection and in data analysis to get a better outcome?"
9. Completeness of submission - Was sufficient information provided to trace back what you actually did, if required - preferably in the main slides, else in the appendices? For instance, were question numbers provided for the inputs to a factor analysis or cluster analysis exercise? Were links to appendix tables present in the main slides? Etc.
10. Creativity, story and flow - Was the submission reader-friendly? Does a 'story' come through, connecting one slide to the next? Were important points highlighted, were cluttered slides built up with sequenced animations, and were callouts and other tools used to emphasize key points on particular slides?
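On the imputation point in item 3 above, here's a minimal sketch of the idea in Python using pandas. The data and column names are made up purely for illustration; the point is simply that even a simple imputation retains rows that listwise deletion would discard.

    import pandas as pd

    # Hypothetical survey responses with scattered missing values.
    df = pd.DataFrame({
        "q1_satisfaction": [5, 3, None, 4, 2],
        "q2_value":        [4, None, 3, 5, 1],
        "q3_recommend":    [5, 4, 4, None, 2],
    })

    # Listwise deletion drops every row with any missing value...
    print("rows kept by listwise deletion:", len(df.dropna()), "of", len(df))

    # ...whereas a simple mean imputation keeps all rows for analysis.
    df_imputed = df.fillna(df.mean(numeric_only=True))
    print("rows kept after mean imputation:", len(df_imputed), "of", len(df))

Whether mean imputation (or something more sophisticated) is appropriate depends on your data, of course; the grading concern is that you thought through the trade-off and documented your choice.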
OK, that's quite a lot already, I guess.
Sudhir