5. Evaluation Activities
Proposal evaluation is an assessment of the proposal and the offeror's ability to perform the prospective contract successfully. The evaluation enables the selection of the offeror whose proposal represents the best value for the government.
5.4. (ref 1) Upon receipt of proposals, the contract team member(s) should ensure that each offeror has complied with the page restrictions outlined in the Request for Proposal and has included all required documentation (e.g., representations, certifications, and the Small Business Subcontracting Plan, if required). Each team member should begin by reading the Request for Proposal or solicitation, specifically the Evaluation Criteria (Section M or equivalent provision), and each offeror's entire proposal. It is often helpful to make notes as a memory jogger for use when documenting the evaluation, such as in the Rating Team Worksheet or in an electronic source selection tool. (The use of electronic source selection evaluation tools is strongly encouraged for more complex acquisitions.) Paragraph 7 of this procedure describes how evaluation results should be documented.
5.4. (ref 2) It is helpful, perhaps even necessary, for evaluators to keep the rating and evaluation criteria and definitions visible throughout the evaluation process. Use of an evaluation matrix or factor aid is a common practice.
[5.5]
5.5.1. When performing the mission capability evaluation, evaluators must ensure that the narrative assessment focuses on the offeror's proposal as it relates to the evaluation criteria (Section M or equivalent provision of the solicitation), not as it compares to other offerors' proposals. The evaluation must be based solely upon information gathered in the source selection process. Evaluators must guard against making assumptions relative to any individual offeror's proposal. [See FAR 15.305(a)(5) for small business evaluation considerations.]
When considering a blue rating, recognize that blue is earned based upon the magnitude of the additional benefit(s) to the government from the strength(s). The mere existence of a single strength does not necessarily merit a blue rating, though a blue may be warranted if that strength provides considerable benefit to the government. Conversely, a number of strengths may not merit a blue rating if their collective benefit is relatively minor. In any case, the documentation in the source selection record should describe the magnitude of the benefit(s) of the strength(s).
5.5.1.1. (ref 1) While the mission capability technical rating could also be affected by any identified uncertainties, by the time the final proposal revisions are requested, any uncertainties should have been resolved.
5.5.1.2. (ref 1) Evaluation teams need to address any risk mitigators. This can provide greater differentiation between proposals and a more meaningful tradeoff assessment for the source selection authority. Any identified mitigators should reduce the potential for disruption of schedule, increased cost, or degradation of performance, and will either be included in the contract or be inherent in the offeror's process. The sample analysis worksheets include areas in which to discuss any identified offeror mitigation efforts.
5.5.1.2. (ref 2) Some source selection team members are unsure of what would constitute a specific risk rating, and offerors may be equally unsure of how their approach may garner a certain risk rating. To better communicate how certain technical approaches would be viewed in terms of risk, the following guidance is included to give individuals a frame of reference. Some teams have included this guidance in the RFP so that offerors would understand how approaches would be viewed by the Government. An aspect of a proposal that has been identified as an uncertainty or deficiency under Mission Capability could result in a Moderate, High, or even Unacceptable risk rating. This appropriately documents the failure (or uncertainty) to meet a requirement and the resulting risk to schedule, cost, and/or performance. Other times, the proposal may meet or exceed requirements, but the approach to meeting/exceeding the requirements involves risk to schedule, cost, and/or performance and will also drive a Moderate, High, or even Unacceptable risk rating. The following examples illustrate the basis for assigning certain risk ratings (a notional sketch of this rating logic follows the examples):
(1) For a weapon system:
(a) If the offeror has not demonstrated the maturity of the proposed system, subsystems, and components to a sufficient level for the acquisition phase of the program, and the proposed cost and schedule are inadequate to deliver the proposed capability, then a "High Risk" or "Unacceptable" rating would likely be assigned. Indicators could include: (i) more than one key technology is at a Technology Readiness Level (TRL) less than 6; (ii) a Preliminary Design Review (PDR) resulting in more than one waiver or significant action item; and (iii) integration and test efforts are not planned in sufficient detail, so the Integrated Master Plan/Integrated Master Schedule (IMP/IMS), Work Breakdown Structure (WBS), and Integrated Baseline Review (IBR) are insufficient to establish cost and schedule targets.
(b) If the offeror has demonstrated the maturity of the proposed system, subsystems, and components to a sufficient level, and there is a moderate expectation that the proposed mission capability will be achieved within the proposed cost and schedule, then in all likelihood a "Moderate Risk" rating would be assigned. Indicators could include key technologies at TRL 6 with acceptable systems engineering approaches and suppliers identified, but some differences exist between government estimates of cost, schedule, or other aspects of the offeror's submission.
(c) If the offeror has clearly demonstrated the maturity of the proposed system, subsystems, and components to a high level, and there is a high expectation that the proposed mission capability will be achieved within the proposed cost and schedule, then a "Low Risk" rating would likely be assigned. Indicators could include: (i) all technologies required for the technical solution are at TRL 7 or greater; (ii) PDR is complete with no action items or waivers; (iii) the scope of integration and test efforts is benchmarked and reflected in a robust IMP/IMS; and (iv) the contractor's plan reflects risk management and a methodology to conduct iterative trade studies to offer options as needed to complete the program within target cost and schedule.
(2) For a service:
(a) If an offeror's proposed approach to providing the required services has not been attempted or demonstrated to be successful in either a commercial or a DoD environment, then a High or Unacceptable risk rating may be appropriate, particularly if the required services are considered mission-essential. Indicators may be a staffing plan that appears to have an inappropriate span of control or a plan that appears to rely on extensive cross-utilization of personnel. If an offeror proposes an approach to operation and maintenance of a mission system involving transition from a legacy system to an updated system, without a testing or overlapping transition period, this would be rated High or Unacceptable risk.
(b) If an offeror's proposed approach to providing the required services has been successfully demonstrated in a commercial environment but never used for a DoD requirement, then a Moderate risk rating may be appropriate. An indicator may be an implementation plan that does not allow adequate time to have equipment and properly cleared personnel in place to start performance.
(c) If an offeror's proposed approach to providing the required services has been successfully demonstrated and proven to be effective in satisfying a DoD requirement, then a Low risk rating may be appropriate. Indicators would be proposed staffing with proven processes and implementation plans that reflect the offeror's technical expertise concerning the requirement. Any identified risks would have sound mitigation plans in place.
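The weapon-system examples above amount to a decision rule over a handful of indicators. The following is a notional sketch only, assuming hypothetical indicator names and thresholds drawn from examples (a) through (c); actual ratings rest on documented evaluator judgment, not a formula:

```python
from dataclasses import dataclass

@dataclass
class WeaponSystemIndicators:
    """Notional indicators drawn from the weapon-system examples above."""
    min_key_technology_trl: int       # lowest TRL among key technologies
    key_techs_below_trl6: int         # count of key technologies below TRL 6
    pdr_open_waivers_or_actions: int  # waivers/significant action items out of PDR
    imp_ims_sufficient: bool          # IMP/IMS, WBS, and IBR support cost/schedule targets
    cost_schedule_adequate: bool      # proposed cost/schedule adequate for the capability

def assess_risk(ind: WeaponSystemIndicators) -> str:
    """Map indicators to a notional risk rating per examples (a)-(c)."""
    # Example (a): immature key technologies plus inadequate cost/schedule.
    if ind.key_techs_below_trl6 > 1 and not ind.cost_schedule_adequate:
        return "High Risk / Unacceptable"
    # Example (c): all technologies at TRL 7+, clean PDR, robust IMP/IMS.
    if (ind.min_key_technology_trl >= 7
            and ind.pdr_open_waivers_or_actions == 0
            and ind.imp_ims_sufficient
            and ind.cost_schedule_adequate):
        return "Low Risk"
    # Example (b): key technologies at TRL 6 with an acceptable approach,
    # but some differences remain against government estimates.
    return "Moderate Risk"
```

The sketch deliberately returns Moderate for the middle ground; in practice, the narrative documentation of why an indicator drives the rating matters more than the label itself.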
5.5.2. (ref 1) Information obtained in the past performance evaluation often has a bearing upon the overall consideration of other factors. For example, consider a contractor who proposed to deliver an emerging technology to fill a contract requirement. The proposed approach promises tremendous benefit for the government, hence the mission capability technical rating for the specific subfactor is BLUE; but because the technology is unproven, the mission capability risk rating for that same subfactor is HIGH. A review of the offeror's past performance, however, reveals that the offeror routinely matures emerging technology and takes it to market, on schedule and under budget. In his or her integrated assessment, the Source Selection Authority may consider this information and effectively lower the proposal risk from HIGH to MODERATE or LOW. Such a conclusion must be fully documented.
5.5.2. (ref 2) The past performance evaluation should concentrate on assessing the delivery of an offeror's products and/or services, and should be tailored to the mission capability factor/subfactors, the Cost/Price factor, and other solicitation requirements that, if not successfully accomplished, could result in disruption of schedule, increased costs, or poor performance.
Early identification and use of past performance information is critical to enable government evaluators to focus this measure of the performance confidence assessment. The past performance evaluation should concentrate on those aspects of the instant acquisition most critical to overall success, and evaluation of offerors' performance should focus on demonstrated performance in those specific areas. Evaluators should consider mitigating circumstances, such as process changes, that have resulted in improvements to previous performance problems. However, process changes should only be considered when objectively measurable improvement in performance has been demonstrated as a result of the changes.
It is important to remember that "past performance" and "experience" are not the same thing. The past performance evaluation is used to determine how well an offeror has performed previous efforts; experience indicates how often, and for how many years (or months), an offeror has performed similar efforts, not necessarily how well the offeror performed.
Past performance is judged on the whole record, not solely on the number of records. For example, an offeror with 30 relevant and recent past performance evaluations is not automatically superior to an offeror with only 10; a straight comparison of the number of submissions should rarely, if ever, drive the rating.
5.5.2. (ref 3) The Air Force Past Performance Evaluation Guide provides step-by-step guidance on accomplishing a Past Performance evaluation.
5.5.2. (ref 4) After reviewing the list of information provided by the offeror and the information gathered from other sources, the evaluation should concentrate upon recent and relevant contracts/programs/efforts that will permit an in-depth evaluation. More recent and more relevant performance usually has greater impact on the performance confidence assessment than less recent and less relevant performance. When determining the extent to which a referenced effort (e.g., contract or delivery order) is relevant, consideration should be given, but not limited, to such things as product or service similarity, product or service complexity, contract type, program or lifecycle phase, contract environment, division of the company proposing, and subcontractor interaction. The evaluation should take into account past performance information regarding predecessor companies, key personnel who have relevant experience, or subcontractors that will perform major or critical aspects of the requirement when such information is relevant to the instant acquisition. The Air Force Past Performance Evaluation Guide provides a detailed discussion and examples of how to assess recency and relevancy.
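One way to picture the weighting of recency and relevancy described above, together with the record-count caution in the preceding paragraphs, is as a weighted aggregation. This is a purely illustrative sketch; the function, scales, and weights are assumptions and do not represent the method in the Air Force Past Performance Evaluation Guide:

```python
# Illustrative only: notional recency/relevancy weighting of past
# performance records. Scales and weights are assumptions, not policy.
def confidence_score(records):
    """Each record is (quality, relevancy, recency), all on a 0.0-1.0 scale.
    More relevant and more recent records carry more weight; sheer record
    count does not drive the result."""
    weighted = [(q * rel * rec, rel * rec) for q, rel, rec in records]
    total_weight = sum(w for _, w in weighted)
    if total_weight == 0:
        return None  # no usable record: evaluated neither favorably nor unfavorably
    return sum(s for s, _ in weighted) / total_weight

# A recent, highly relevant performance problem outweighs many old,
# marginally relevant successes, even though the second offeror has
# far fewer records overall.
offeror_a = [(0.3, 1.0, 1.0)] + [(0.9, 0.2, 0.2)] * 30  # 31 records
offeror_b = [(0.8, 0.9, 0.9)] * 5                       # 5 records
print(confidence_score(offeror_a))  # ~0.63, pulled down by the relevant problem
print(confidence_score(offeror_b))  # 0.80
```

Note that the `None` branch mirrors the rule in 5.5.2.2 below: an offeror with no usable record is evaluated neither favorably nor unfavorably.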
Special consideration should be given to subcontractor past performance evaluation in teaming arrangements and when significant subcontracting effort is proposed. The FAR, as supplemented, states that when the solicitation includes the clause at FAR 52.219-8 or FAR 52.219-9, the evaluation shall include the past performance of offerors in complying with subcontracting plan goals for the affected concerns, monetary targets for small and small disadvantaged business participation, and notifications submitted under FAR 19.1202-4(b) (see also FAR 15.305(a)(2)(v) and DFARS 215.305(a)(2)).
5.5.2.2. An offeror without a record of relevant past performance, for whom information on past performance is not available, or whose performance record is so limited that no confidence assessment rating can reasonably be assigned, will not be evaluated favorably or unfavorably. [FAR 15.305(a)(2)(iii) & (iv)]
5.5.3. (ref 1) Reference FAR 15.305(a)(1).
The cost or price evaluation factor is normally limited to an assessment of reasonableness and in certain cases, realism (reference FAR 15.4 as supplemented for definitions of cost realism and price reasonableness).
1. Price Reasonableness. All source selections are conducted with the expectation of adequate price competition and rely on market forces to ensure awarded prices are reasonable. Only in extraordinary circumstances will additional information beyond proposed prices be necessary for the contracting officer to determine that the price is fair and reasonable.
2. Cost Realism. If a cost realism analysis is to be accomplished, the offeror should be advised that the Source Selection Authority will be shown both the government estimate of probable cost or price, and the offeror's proposed cost or price during the evaluation briefing. The evaluation criteria (section M of the Request for Proposal or equivalent solicitation provision) must clearly state how the cost evaluation is to be conducted.
3. Affordability. When trade-offs are considered, the cost factor definition must also consider affordability. When defining affordability, some acquisition teams have found it prudent to share budget information; however, this practice may not be suitable for all source selections.
4. Data. The amount of price/cost data (FAR 15.402) requested in the Request for Proposal or solicitation should be limited to only the data absolutely necessary for making the reasonableness/realism assessment (FAR 15.403, as supplemented). The contracting officer, supported by any price/cost analysis team members, is responsible for all aspects of the price or cost evaluation, and bears the responsibility for determining the amount of price or cost information to be requested in the Request for Proposal.
5.5.3. (ref 2) When utilizing the most probable cost (MPC) estimating process for ACAT I programs, a cost uncertainty analysis should be conducted that allows a range (or distribution) of possible costs to be developed based on statistical techniques. This is necessary because the term "most probable" implies that other, less likely estimates exist. Cost uncertainty analysis quantifies uncertainty due to the variance in cost estimating methods, as well as uncertainty in the technical, schedule, performance, and programmatic inputs. The application of the various statistical techniques inherent in uncertainty analysis will result in a mathematically correct most probable cost, a level of confidence, and the confidence levels for all other costs. Uncertainty analysis is highly recommended but not required for programs smaller than ACAT I. For additional information, reference the Air Force Handbook of Cost Uncertainty and Risk Analysis. This handbook and related information are located on the FM Knowledge Now Website (https://afkm.wpafb.af.mil/ASPs/CoP/OpenCop.asp?Filter=OO-FM-CA-01).
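As a concrete illustration of the kind of statistical technique described above, the following sketch runs a notional Monte Carlo simulation over uncertain cost-element inputs to produce a distribution of possible total costs, a confidence level for the point estimate, and costs at other confidence levels. The element names, triangular distributions, and dollar figures are illustrative assumptions, not actual program data or the handbook's prescribed method:

```python
import bisect
import random
import statistics

random.seed(0)  # reproducible illustration

# Notional cost elements with (low, most likely, high) estimates in $M;
# triangular distributions stand in for whatever the estimators can justify.
cost_elements = {
    "airframe":    (120.0, 150.0, 210.0),
    "integration": ( 40.0,  55.0,  95.0),
    "test":        ( 25.0,  30.0,  50.0),
}

# Monte Carlo: sample each element independently and sum to a total cost.
trials = 100_000
totals = sorted(
    sum(random.triangular(lo, hi, mode) for lo, mode, hi in cost_elements.values())
    for _ in range(trials)
)

def confidence_level(cost):
    """Fraction of simulated outcomes at or below the given cost, i.e. the
    confidence that the actual cost will not exceed it."""
    return bisect.bisect_right(totals, cost) / trials

point_estimate = sum(mode for _, mode, _ in cost_elements.values())  # 235.0
print(f"Sum of point estimates:  ${point_estimate:.1f}M "
      f"(confidence {confidence_level(point_estimate):.0%})")
print(f"50% confidence (median): ${statistics.median(totals):.1f}M")
print(f"80% confidence:          ${totals[int(0.8 * trials)]:.1f}M")
```

Because the notional input distributions are skewed toward overruns, the sum of the point estimates carries well under 50 percent confidence, which is exactly why "most probable" implies that other, less likely estimates exist.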
5.5.4. The MPC referred to in the mandatory procedures is the same as the probable cost referred to in the FAR. A draft program office estimate (POE) or Independent Government Estimate (IGE), with the cost estimating data and methodology, should be established before the intense and extensive up-front communications with prospective offerors. The POE/IGE may be altered due to methodologies and data acquired during these discussions. The MPC developed for each offeror may use some of the estimating techniques from the POE/IGE, depending on the proposed solution when compared to the assumptions for the POE/IGE. Many times the estimating techniques will be adjusted for the offeror's unique characteristics. The Cost/Price risk evaluation is the result of comparing and contrasting each offeror's MPC (and its associated uncertainty analysis) with each individual proposal (and its associated uncertainty analysis).
If a decision has been made to do a formal uncertainty analysis, the offeror shall provide both its proposed cost and its analysis of uncertainty as part of its proposal. In order to ensure that the Air Force and industry adequately understand the degree of cost/price risk associated with an offeror's proposal, the SSET must adequately communicate the rationale for the government's assessment of the Cost/Price Risk factor to each offeror during the discussions period after the competitive range determination but before the request for Final Proposal Revision. The uncertainty analysis should be performed for ACAT I programs, and the AF Cost and Risk Uncertainty Handbook can be used as a guide. This handbook and related information are located on the FM Knowledge Now Website (https://km.saffm.hq.af.mil/). Uncertainty analysis is highly recommended but not required for programs smaller than ACAT I.
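Continuing the notional simulation from the sketch in 5.5.3 (ref 2) above, comparing an offeror's proposed cost against the government's MPC distribution might be pictured as follows; the percentile thresholds and wording are illustrative assumptions only, not a prescribed rating rule:

```python
import bisect

def realism_flag(proposed_cost, mpc_totals):
    """Notional cost-realism comparison: locate the proposed cost within
    the government's sorted MPC distribution (e.g. the `totals` list from
    the earlier sketch) and flag outliers for discussion."""
    pct = bisect.bisect_right(mpc_totals, proposed_cost) / len(mpc_totals)
    if pct < 0.10:
        return "well below the MPC range: probe optimistic assumptions"
    if pct > 0.90:
        return "well above the MPC range: probe scope or pricing approach"
    return f"within the MPC range (~{pct:.0%} confidence level)"
```

Any flag raised this way would be a starting point for Evaluation Notices and discussions, not a rating by itself.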
[5.6]
[5.6.1]
5.6.1.1. The Source Selection Evaluation Team may recommend award without discussions at the Decision Briefing. In this instance a competitive range determination is not required. However, when awarding without discussions, the team should obtain approval from the SSA first, then obtain contract clearance. This is more efficient than performing contract clearance first, and subsequently having the SSA determine that discussions are necessary.
5.6.1.2. The issuance of Clarification and Communication Evaluation Notices (ENs) should not result in the revision of an offeror's proposal.
5.6.2. (ref 1) Normally, when the Source Selection Authority is other than the contracting officer, a competitive range briefing is conducted. The Competitive Range Briefing may be used to document the competitive range determination for the Source Selection Authority, including the Source Selection Evaluation Team's interim evaluation of all offerors and the Source Selection Evaluation Team's recommended Evaluation Notices. The briefing is primarily used to obtain Source Selection Authority approval to enter discussions (issue Evaluation Notices) and/or eliminate offerors from the competitive range. When a competitive range briefing is required, charts should be developed in sufficient detail to support the contracting officer's recommendations.
5.6.2. (ref 2) At this point in the process, there are frequently numerous issues to discuss with offerors. Therefore, it is especially important to explain clearly to the Source Selection Authority which issues are of greatest significance, particularly those for which it may be necessary to issue Evaluation Notices regarding deficiencies in the offeror's proposal.
5.6.2.1. (ref 1) Generally, past performance information is considered adverse if it supports a less-than-satisfactory rating on any evaluation element or includes any unfavorable comment.
5.6.2.1. (ref 2) Reference FAR 15.306(a)(2), (b)(1), and (d)(3).
5.6.5. (ref 1) "Discussions" are required for those areas of a proposal that are considered deficient, where weaknesses exist, where other aspects of the offeror's proposal (such as cost, price, technical approach, past performance, and terms and conditions) are significant enough to affect the selection decision, and/or where information presented by the offeror is unclear. These areas may include issues of compliance with the requirements of the Request for Proposal other than evaluation factors.
Discussions must be sufficiently robust to ensure a complete understanding of the proposal, and may include real-time, face-to-face dialogue as necessary. These detailed discussions can include any aspect of a proposal that the Air Force wishes to discuss, but at a minimum should specifically address areas of identified or suspected risk to enable the government to quantify potential cost, schedule, and performance impacts. Discussions may also use negotiations and bargaining, as described in FAR 15.306(d), which includes persuasion, alteration of assumptions and positions, and give-and-take, and may apply to price, schedule, technical requirements, type of contract, or other terms of a proposed contract. The discussion phase permits offerors to formulate revisions to their proposals as necessary. Especially in those areas where proposal revisions require changes to contractually-binding documentation, "slip pages of different colors" may be provided by offerors to ensure incorporation of contractually-binding language prior to considering the issue resolved. The contracting officer also is encouraged to discuss other aspects of the offeror's proposal that could, in the opinion of the contracting officer, be altered or explained to materially enhance the proposal's potential for award. However, the contracting officer is not required to discuss every area where the proposal could be improved. The scope and extent of discussions are a matter of contracting officer judgment.
"Discussions" may be conducted orally, in writing, or both, as determined by the nature of the issues to be addressed. The team determines what issues need to be addressed; however, keep in mind that the scope and extent of "discussions" are a matter of contracting officer judgment (FAR 15.306(d)(3)). For complex systems, complex services, solicitations where oral presentations are included, or other source selections where it is deemed valuable, the source selection team should consider whether it is realistic to attempt award without discussions. If it does not make sense to try to award without discussions, the team can state in its source selection plan and RFP that it intends to enter discussions immediately upon receipt of proposals. This eliminates the possibility of award without discussions, but allows the SSET the flexibility to ask questions immediately, gain the information necessary to complete initial evaluations, and possibly shorten source selection time periods. Successive competitive range determinations can be made after the conclusion of initial evaluations if necessary. The choice to open discussions immediately should not be restricted to source selections that employ oral presentations.
Before concluding discussions, teams should consider the release of a "pre-FPR" package that includes the ratings, the model contracts, and a specific delineation of any outstanding issues. This provides offerors the opportunity to resolve any potential issues prior to release of the Final Proposal Revision (FPR) request, and would logically reduce the likelihood of substantial post-FPR discussions.
5.6.5. (ref 2) Teams may use the actual briefing charts that were used to brief the SSA as a method of providing each offeror the results of its ratings at the initiation of discussions and prior to the final proposal revision request.
5.6.5.2. Oral "discussions" are a useful method, but must be documented in writing for the official record. Notice of adverse past performance should be provided in writing through the issuance of an evaluation notice. When utilizing written "discussions," the Evaluation Notice form or a similar form is normally used. Whatever method is chosen, "discussions" should be accomplished using the most efficient, economical, and timely means.
5.6.6. (ref 1) The request for final proposal revisions should highlight any remaining deficiencies in the offeror's proposal. Ordinarily, no further negotiation occurs before the Source Selection Authority decision and award to the successful offeror(s). In the event further "discussions" are required after receipt of the final proposal revisions, the contracting officer may, with Source Selection Authority concurrence, reopen "discussions"; however, great care must be exercised to avoid providing an unfair advantage to any offeror. (FAR 15.307)
5.6.6. (ref 2) The team should jointly evaluate the final proposals using the Rating Team Worksheets, or other similar document. Only one worksheet is completed for each offeror (unless the team is evaluating subfactors; in this case, the team should use one sheet for each subfactor per offeror). The "final" evaluation block of the worksheet should be checked.