15 Tips & Questions for Requirements Gathering on Data & AI Projects
Gathering requirements on Data & AI projects is vital. Whether the initiative involves Data Warehousing, Predictive Analytics, AI, Data Science, ML or IoT, requirements gathering done well helps delivery and project management teams understand the scope of the initiative more accurately and ensures that stakeholders are ultimately satisfied with the result.
Some High Level Guidelines
15 Tips and Questions
1. Briefly describe the report/analysis/system/model/platform/project, including desired timelines.
2. How could the problem be solved manually? What is the human-level performance on this task at present?
3. What does the organisation/SME already know at a high level? Or what does the organisation/SME expect the results to show at a high level?
4. Have any assumptions been made so far? Have any of these assumptions been validated?
5. What are the organisation's goals for the project? What defines success? What defines the proof of value? The answers need to be accepted by the business as well as the analysts.
6. What is the approximate dollar value of the expected return?
7. What are the use cases? i.e. How will the results be used (make business decisions, invest in product categories, work with a vendor, identify risks, etc.)?
8. Who is the audience that will use the results from the analysis (board members, sales people, customers, employees, etc.)? A table similar to the one below could be used to capture this information.
| Stakeholder Name | Position | Area of Expertise | Contact Details |
| --- | --- | --- | --- |
|  |  |  |  |
9. How will the audience interrogate the report/analysis/system/model/platform (ability to filter on key segments, look at data across time to identify trends, drill-down into details, etc.)?
10. Who should be able to access the information (think about confidentiality/security concerns)?
11. What does the existing technical architecture look like? Is an architecture diagram available?
12. Is the necessary data available? What data is required (tables and fields)? What is the size of each data set? A table similar to the one below could be used to capture this information.
| Table / Source (incl. Data Format) | Field/s or Data Type | Data Size Approximation | Data Accuracy and Quality (general comments about the level of data cleaning, scraping, massaging, transformation and accuracy) |
| --- | --- | --- | --- |
|  |  |  |  |
13. What are the simple, logical, chronological checkpoints that can be put in place to ensure that the right progress or outcome will be achieved?
14. Who will maintain the report/analysis/system/model/platform once productionised?
15. Is there anything specific that the stakeholders would consider ‘Out of Scope’?
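The data-availability questions in tip 12 can often be answered with a quick profiling pass before any formal ingestion work begins. Below is a minimal pure-Python sketch of such a pass; the field names and sample values are illustrative assumptions, not real project data, and in practice the raw string would be replaced by an extract from the actual source.

```python
import csv
import io

# Illustrative sample standing in for a real source extract
# (hypothetical fields and values, for demonstration only).
raw = (
    "customer_id,region,spend\n"
    "1,North,120.50\n"
    "2,South,\n"
    "3,,88.00\n"
)

rows = list(csv.DictReader(io.StringIO(raw)))
fields = rows[0].keys() if rows else []

# Null rate per field flags the likely cleaning/massaging effort early.
null_rate = {
    f: round(sum(1 for r in rows if not r[f]) / len(rows), 2)
    for f in fields
}

profile = {
    "row_count": len(rows),
    "approx_bytes": len(raw.encode("utf-8")),  # rough size proxy
    "null_rate": null_rate,
}
print(profile)
```

Even a rough profile like this gives concrete numbers to put in the data-capture table (size approximation, quality comments) rather than relying on the SME's recollection of how clean the data is.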