
Update Evaluation Criteria to Match OCS Priorities #20

Open
mheadd opened this issue Feb 28, 2018 · 13 comments

@mheadd
Contributor

mheadd commented Feb 28, 2018

The criteria in Section 5 of the draft will need to be updated to reflect OCS priorities.

We should also discuss the evaluation percentage for cost. The requirement is 40%; however, as you'll see below (20%), we can request a waiver if we can justify why.

Any changes should also be reflected in the table of contents.

Also, the cost proposal exhibit may need to be updated based on this discussion. In the exhibit, the total cost figure used is as follows:

TOTAL PROJECT BUDGET (not to exceed $300,000)

This may need to be altered based on team discussions.

@mheadd mheadd added the $$$$ label Feb 28, 2018
@randyhart
Contributor

If it were up to me, I would lower the cost weighting to 20%. The vendors know that the budget for this is $300K. If we put the cost weighting higher than 20%, it might lead to a "buying in" situation where a vendor submits a low bid just to get an award. The tech is more important than the cost, in my opinion.

@randyhart
Contributor

@DanaPenner @susanjabal let me know if you want to discuss. Ultimately it's your call on this one.

@DanaPenner
Collaborator

Discussed and approved to go ahead

@susanjabal
Contributor

@DanaPenner @randyhart @mheadd
Per our discussion this morning - here are the eval weights I propose for consideration/adjustment:
Technical Understanding & Approach - 15%
Project Management & Approach - 5%
User Interface/Experience Design - 10%
Staffing Plan - 10%
Similar Experience - 10%
Verbal Presentation - 20%
Cost - 20%
AK Offeror's Preference - 10%
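
For a quick sanity check that these weights behave as intended, here's a rough sketch of how the categories would combine into a blended score (the raw scores and the 0-100 scale are made up purely for illustration; only the weights come from the list above):

```python
# Minimal sanity-check sketch (not from the RFP): the proposed weights above,
# with invented raw scores on an assumed 0-100 scale per category.
weights = {
    "Technical Understanding & Approach": 0.15,
    "Project Management & Approach": 0.05,
    "User Interface/Experience Design": 0.10,
    "Staffing Plan": 0.10,
    "Similar Experience": 0.10,
    "Verbal Presentation": 0.20,
    "Cost": 0.20,
    "AK Offeror's Preference": 0.10,
}
# The weights must total 100%.
assert abs(sum(weights.values()) - 1.0) < 1e-9

# Hypothetical raw scores for one offeror (illustration only).
raw_scores = {
    "Technical Understanding & Approach": 85,
    "Project Management & Approach": 70,
    "User Interface/Experience Design": 90,
    "Staffing Plan": 80,
    "Similar Experience": 75,
    "Verbal Presentation": 88,
    "Cost": 60,
    "AK Offeror's Preference": 100,
}

# Each category contributes (weight x raw score) to the blended total.
total = sum(weights[c] * raw_scores[c] for c in weights)
print(f"Weighted total: {total:.2f} / 100")
```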

@mheadd
Contributor Author

mheadd commented Mar 19, 2018

Looks good to me, @susanjabal

@sztaylorakgov
Contributor

Looks good.

@waldoj
Contributor

waldoj commented Mar 19, 2018

We had no idea how to score verbal presentation on our last DHSS procurement, so if we're going to include that as a separate factor, I think we have to agree on what we're measuring with it.

With the interview, we were asking questions about their proposal, and if they told us something about their staffing plan that was a red flag, it only made sense for that to impact their staffing plan score, not their verbal presentation score. Ultimately, we didn't know what we were scoring. How well-spoken they were? Our retrospective cited this as a problem, and we concluded that it didn't make sense to have a separate score for the verbals.

So if we're going to score verbals separately, I think we'll want to figure out what we're evaluating that isn't already covered by the other factors.

@sztaylorakgov
Contributor

You're right, @waldoj - the interviews were intended to be free-form, so we didn't go in with a strong scoring metric/approach. If we need something there, it might make sense to develop something based on things we'd like to see (or are concerned about) in the interview. Some things that come to mind after our DPA experience include:

  • Written proposal and interview agreement/disagreement
  • Ability to stay on point
  • Clarity on who should answer a question
  • Cohesion across the interview team, especially when subcontractors are involved
  • Rescuing (when one or two presenters are consistently re-framing what others on the team are fumbling with)
  • Ability to expand and improve our understanding of their written proposal
  • Ability to absorb new information and present relevant discussion of options
  • Interview elements that look like success/failure factors
  • Business culture compatibility
  • Team strength and weak links

Those are just some ideas.

@randyhart
Contributor

I think it might make sense to consider the two things in the context of a job interview. Our evaluation criteria are similar to the "qualifications" you put out with the job posting.

The initial written proposal is like the resume and/or application from all of the applicants. We score these against the evaluation criteria to make an initial cut of the most qualified applicants, who should then be invited to the job interview.

When they come in for the job interview, we are still interested in the same evaluation criteria. The interview is just another forum for due diligence before making the selection. After the interview, the "points" from the evaluation of the written proposal should be updated to account for any new information that comes out of the one-on-one interaction.

Does that analogy make sense?

@sztaylorakgov
Contributor

@randyhart I think the job interview analogy suggests that we have a single set of evaluation criteria and we score them initially based on the written submission, and then a second time in a verbal interview.

I do like having a place to score the interview distinctly, rather than just changing points in the pre-interview scoring categories. I think there was a sense in the DPA solicitation that maybe that's all we were doing, i.e., just going back and updating our scores in other areas. Then the question becomes: if we're only updating other scores, what is the point of separately scoring the interview?

We went into the DPA solicitation interviews with an approach that was open-ended and probably focused on details that were covered in our other scoring categories. What I noticed is that where their answers really mattered, it was because they were able to move the line by either succeeding or failing, oftentimes somewhat dramatically, at answering our questions. It wasn't just that they had a better or worse answer in terms of content; it was also that they were able to respond in the moment in a way that, for example, demonstrated team strength (or weakness), or their grounded experience (or lack thereof) in agile, etc.

I am advocating that there are other dimensions we should assess in the interview that don't fit in the written proposal. So, we should use the interview to improve our assessment of other scoring categories, and to assess those other "verbal only" dimensions. The ideas I put down above are a fair representation of what those other dimensions might be, though I was a little terse.

I also like giving 20% of the score explicitly to the interview because it conveys to the offeror that they have to be serious about that interview. I'm not too attached either way, though I really think the process we used for DPA served us reasonably well.

Sorry I ran on there a bit... trying to respond between meetings :-/

@susanjabal
Contributor

Good morning everyone - I am back from a week away and working on these RFP issues today.

All of the input provided above is valid and makes sense - we could go either way on evaluating the verbal presentation separately or integrating it.
I suggest we keep the interview element separate, as it may encourage consistency in scoring by individuals on the team. I also suggest we offer a description in Section 5.0 (Evaluation Criteria) that defines the evaluation expectations for the verbal presentation, for us and for the vendor.

How's this:
Section 5.06 Verbal Presentation
The State will evaluate the offeror's ability to expand on, improve, and discuss the proposal sections described in RFP Section 4. The State will also evaluate the offeror's team dynamics, cohesion, and communication flow, and how compatible these are with the OCS team, the agile development process, and the QAP.

As suggested above - the resultant weights would be:
Technical Understanding & Approach - 15%
Project Management & Approach - 5%
User Interface/Experience Design - 10%
Staffing Plan - 10%
Similar Experience - 10%
Verbal Presentation - 20%
Cost - 20%
AK Offeror's Preference - 10%

thoughts?

@mheadd
Contributor Author

mheadd commented Mar 29, 2018

@susanjabal @randyhart Once we resolve issue #16, I think we can finalize this one as well.

@mheadd
Contributor Author

mheadd commented Jul 13, 2018

@sandralee19 Adding you to this issue.
