THIS OPINION WAS INITIALLY ISSUED UNDER PROTECTIVE ORDER AND IS BEING RELEASED TO THE PUBLIC IN REDACTED FORM ON March 31, 1995
_________________________________________________________

GRANTED: March 3, 1995
_________________________________________________________

GSBCA 13129-P

UNISYS CORP.,

          Protester,

     v.

DEPARTMENT OF THE AIR FORCE,

          Respondent,

and

TRW INC.,

          Intervenor.

C. Stanley Dees, J. Keith Burt, and Kurt J. Hamrock of McKenna & Cuneo, Washington, DC; Jeffrey P. Metzger, Thomas Miiller, and E. Charles Rowan, Jr., of Unisys Corp., McLean, VA, counsel for Protester.

Clarence D. Long, III, Office of the General Counsel, Department of the Air Force, Washington, DC; Capt. Steven Joseph Dunn and Wilbert Wellons Edgerton, Department of the Air Force, Tinker Air Force Base, OK; and Maj. Anthony Larry Steadman and Capt. Brenda A. J. Paknik, Department of the Air Force, Maxwell Air Force Base, AL, counsel for Respondent.

Kenneth B. Weckstein, Raymond Fioravanti, and Jose Otero of Epstein, Becker & Green, Washington, DC; Marsha A. Klontz of TRW Inc., Avionics and Surveillance Group, San Diego, CA; and Sheila L. Sparks of TRW Inc., Redondo Beach, CA, counsel for Intervenor.

Before Board Judges DANIELS (Chairman), PARKER, and VERGILIO.

PARKER, Board Judge.

In this protest, Unisys Corporation challenges the Department of the Air Force's award of a contract to provide computer hardware, software, and associated engineering services. The contract awardee, TRW Inc., has intervened on the side of the Air Force. Unisys maintains that the Air Force's best value determination was irrational, that the method the Air Force used to evaluate past performance was fatally flawed, and that TRW's proposal failed to comply with a solicitation requirement to provide past performance information for significant subcontractors.

For the reasons discussed below, we grant Count I of the protest. The Air Force's best value determination, as written and described by Air Force officials, does not support the decision to award the contract to TRW. The remaining counts are denied.

Findings of Fact

The Solicitation

1. On July 12, 1993, the Air Force issued a request for proposals (RFP) entitled Unified Local Area Network Architecture (ULANA) II. The RFP solicited offers to provide local area network (LAN) hardware, software, and associated engineering and support services to the Department of Defense and other Federal agencies. Protest File, Exhibit 6. The solicitation told offerors that the Air Force intended to award two contracts, each of which would be of the indefinite delivery, indefinite quantity type, with time-and-materials, fixed-price, and cost-reimbursable line items.[foot #] 1 Essentially, each ULANA II contractor will be required to respond to Government requests for a LAN by surveying the proposed site, designing a system, recommending the products needed for the system, and, after Government approval, installing the products and maintaining the system. Id.

----------- FOOTNOTE BEGINS ---------
[foot #] 1 One contract was awarded to Electronic Data Systems, Inc. (EDS). Unisys' protest challenges only the second award, to TRW. Protest Complaint.
----------- FOOTNOTE ENDS -----------

2. Proposals were to be evaluated in three "areas" listed in descending order of importance: technical, management, and cost/price. Protest File, Exhibit 11. Each of these areas was further divided into "items," which were to be rated separately. Pertinent to this protest is the management area, which was divided into three items: life cycle support, warranty and maintenance, and program management. Id.
3. The technical and management areas (and their "items") were to be assessed in three different ways: a color/adjectival rating, a proposal risk rating, and a performance risk rating:

     The color/adjectival rating depicts how well the offeror's proposal meets the evaluation standards and solicitation requirements. Proposal risk assesses the risk associated with the offeror's proposed approach as it relates to accomplishing the requirements of this solicitation. Performance risk assesses the probability of the offeror successfully accomplishing the proposed effort based on the offeror's demonstrated present and past performance. Within each area and/or item, each of the three ratings (color/adjectival, proposal risk and performance risk) shall be considered of equal importance in making an integrated source selection decision.

Protest File, Exhibit 11. Finally, the cost/price area was to receive a performance risk rating that would be considered less important than the evaluated price. Id. An offeror's compliance with the RFP requirements in the technical and management areas (and items) was to be rated through the use of a color scheme: "blue" (exceptional) for proposals that exceeded the requirements in a beneficial way, "green" (acceptable) for proposals that met evaluation standards and had a good probability of satisfying the requirement, "yellow" (marginal) for proposals that failed to meet evaluation standards and had a low probability of satisfying the requirements, and "red" (unacceptable) for proposals that failed to meet minimum requirements. Id., Exhibit 1. Proposal risk for the technical and management areas was to be rated adjectivally as "high," "moderate" or "low" risk. Id.

4. Section M-703 set forth the process for evaluating an offeror's performance risk:

     Performance risks are those associated with an Offeror's ability to perform the solicitation's requirements as indicated by the Offeror's record of past and present performance on other contracts. Performance risk is assessed by the Performance Risk Analysis Group (PRAG) and is assigned an adjectival rating. This rating is derived from a narrative assessment of the Offeror's performance record after validation, review and analysis of the Contractor's prior relevant Government and non-Government contracts and data obtained by the use of customer feedback reports. The results of the complete analysis will be final separate technical . . . , management and cost performance risk assessments. The ratings will be stated as high, moderate or low.

Protest File, Exhibit 11. The ratings were defined as follows:

     HIGH - Significant doubt exists, based on the offeror's performance record, that the offeror can perform the proposed effort.

     MODERATE - Some doubt exists, based on the offeror's performance record, that the offeror can perform the proposed effort.

     LOW - Little doubt exists, based on the offeror's performance record, that the offeror can perform the proposed effort.

Id., Exhibit 18. In order to assist the PRAG in evaluating past performance, offerors were required to provide information on up to twenty-five relevant contracts performed by the offeror (or a major subcontractor, teaming partner or subsidiary) in the last three years. Id., Exhibit 6.
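The structure of the rating scheme described in Findings 3 and 4 can be sketched in a few lines of code. The sketch below is purely illustrative -- the class names and the sample ratings are hypothetical, and the actual ULANA II ratings are redacted from this opinion:

from dataclasses import dataclass
from enum import Enum

class Color(Enum):
    BLUE = "exceptional"    # exceeds requirements in a beneficial way
    GREEN = "acceptable"    # meets evaluation standards
    YELLOW = "marginal"     # fails standards; low probability of success
    RED = "unacceptable"    # fails minimum requirements

class Risk(Enum):
    LOW = "little doubt offeror can perform"
    MODERATE = "some doubt"
    HIGH = "significant doubt"

@dataclass
class ItemRating:
    """One evaluated 'item' (e.g., program management) within an area.

    Under the RFP, the three ratings carry equal weight in the
    integrated source selection decision; they are never collapsed
    into a single score.
    """
    color: Color            # compliance with standards and requirements
    proposal_risk: Risk     # risk of the offeror's proposed approach
    performance_risk: Risk  # risk shown by past and present performance

# Hypothetical illustration only; the actual ULANA II ratings are redacted.
program_mgmt = ItemRating(Color.GREEN, Risk.LOW, Risk.MODERATE)
print(program_mgmt)

The point of the sketch is simply that each area and item carries three parallel ratings which the solicitation weights equally; nothing in the scheme reduces them to one number.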
5. Section M of the RFP explained how the awardees would be selected:

     The Government intends to award two contracts to meet the ULANA II requirements. The Government will determine the two proposals that, each as a stand alone offer, provide the best and second best overall value to meet the Department of Defense requirements. . . . The Government is more concerned with obtaining superior technical and management features than with making awards at the lowest price. Awards will be determined by comparing differences in the benefit of technical and management features with differences in cost to the Government.

Protest File, Exhibit 11.

6. A source selection plan established a source selection organization for the ULANA II procurement comprised of a Source Selection Authority (SSA), a Source Selection Advisory Council (SSAC) and a Source Selection Evaluation Board (SSEB). Protest File, Exhibit 5. Section M-705 of the solicitation informed offerors of the roles that each of these entities would play in the selection process:

     Following completion of proposal evaluations by the SSEB, the evaluation results will be analyzed by the SSAC to determine the benefit of Technical, Management, evaluated cost/price areas, and past performance record. This cost/technical trade-off analysis will use the following methodology:

     1. Determine Proposal Discriminators. This step determines the significant areas of differences (i.e., discriminators) among the proposals by reviewing the SSEB's evaluation results of the Offerors' proposals.

     2. Identify Impact Areas. In this step, using the evaluation areas, items, factors, subfactors and standards, the SSAC will, possibly with the use of outside contractors, identify the impact of the discriminators. The SSAC may, with outside contractors, quantify the impact of these discriminators.

     3. Best Value Determination. Based on an integrated assessment of the evaluation results and the cost/technical tradeoff, the source selection authority will determine which proposals represent the best value to the government. We are seeking a superior technical solution in accordance with the assessment criteria at a reasonable price.

Id., Exhibit 11.

Proposal Submission

7. Five offerors -- Unisys, TRW, EDS, AT&T, and General Wire Products -- submitted seven proposals in response to the RFP, with EDS submitting three separate proposals. Protest File, Exhibit 67. In its best and final offer (BAFO), TRW for the first time proposed its wholly-owned subsidiary, TRW Systems Service Co., Inc. (TSSCo), as a major subcontractor for the ULANA II contract. Protester's Exhibit 4. Although TRW did not include certain required past performance information on TSSCo, TSSCo had performed portions of the work for six of the twenty-two relevant contracts listed by TRW in connection with the past performance evaluation. Intervenor's Exhibit 1. No significant past performance problems were found with respect to those contracts, and Unisys has not explained how TRW's failure to provide additional information about TSSCo would have influenced the award decision.

Proposal Evaluation

The SSEB and the PRAG

8. As part of the SSEB's evaluation of proposals, the PRAG rated performance risk. The PRAG developed an extensive questionnaire which asked various questions about the offerors' past performance.
The questionnaire, which was sent to Government personnel identified by the offerors in connection with their lists of relevant contracts, was divided into sections such that specific performance information would be obtained on the "areas" and "items" identified in the solicitation. Protest File, Exhibit 63.

9. The typical question asked for a multiple choice response, and provided several blank lines for additional remarks. Protest File, Exhibit 63. When the PRAG received a negative comment in one of the "remarks" sections, it sometimes, but not always, generated a clarification request, which asked the offeror to respond to the negative comment. Id.; Transcript at 175. The response from the offeror then was sent back to the originator of the negative comment for rebuttal. Transcript at 171.

10. The PRAG considered all of the information from the returned questionnaires, including the offerors' responses and the rebuttals thereto (when provided), and identified for each offeror strengths and weaknesses for each item. Transcript at 175, 273. These strengths and weaknesses were then used to assign risk ratings to the various items. The ratings for the items were then used to assign risk ratings to the three areas -- technical, management, and cost/price. Id. at 175.

11. Unisys' expert on survey research testified credibly that the PRAG process was not very scientific. The Board agrees with the expert's concerns about many of the questions which, because of their wording, required additional narrative remarks in order to answer clearly. See Transcript at 324-26. Clearly, the expert is correct that some people are more willing to provide additional remarks than others, a fact that makes the results of the PRAG survey somewhat unreliable. Id. at 428. We also agree with the expert's criticism of the PRAG's process for identifying strengths and weaknesses. Although the PRAG did consider all of the relevant information, the record shows that the strengths and weaknesses identified by the PRAG were directly related to the additional "remarks," rather than the responses to the multiple choice questions, which were not even tallied. See Protest File, Exhibit 63; Transcript at 273, 332-33.

12. Despite the unscientific nature of the PRAG process, a reading of the completed questionnaires discloses an undeniable dissatisfaction with Unisys' performance on certain contracts. Without going into detail, suffice it to say that a fair number of the commenters took the time to explain in the remarks sections their unhappy experiences in dealing with Unisys. Protest File, Exhibit 51. TRW had relatively few such remarks.

13. The SSEB's final evaluation can be summarized as follows[foot #] 2 (the individual ratings and evaluated prices are redacted):

                                   UNISYS     EDS     TRW
     Technical Area
       Proposal risk
       Performance risk
       [technical "items" omitted]
     Management Area
       Proposal risk
       Performance risk
       Item 1 (life cycle)
         Proposal risk
         Performance risk
       Item 2 (warr. & maint.)
         Proposal risk
         Performance risk
       Item 3 (prog. mgmt.)
         Proposal risk
         Performance risk
     Cost/Price Area*
       Performance risk

     * "Evaluated Price," in millions

Protest File, Exhibit 69.

----------- FOOTNOTE BEGINS ---------
[foot #] 2 The evaluation results for the other offerors are not relevant to this protest.
----------- FOOTNOTE ENDS -----------

The SSEB did not attempt to combine the color ratings, the proposal risk ratings, and the performance risk ratings into an integrated score or rating for the technical and management areas, or to combine the evaluated prices with the performance risk ratings in the cost/price area. See Finding 3.
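The survey-methodology criticism credited in Findings 9 through 11 can be illustrated with a short sketch. The data below are invented, and the code is only a hypothetical rendering of the two approaches: tallying every multiple-choice answer (which, per Finding 11, the PRAG never did) versus keying strengths and weaknesses to whatever free-text remarks happened to be written:

from collections import Counter

# Invented questionnaire returns for one offeror and one "item".
# Each return has a multiple-choice answer and an optional remark.
responses = [
    {"answer": "satisfactory",   "remark": None},
    {"answer": "satisfactory",   "remark": None},
    {"answer": "unsatisfactory", "remark": "late deliveries on two task orders"},
    {"answer": "satisfactory",   "remark": None},
]

# The tally the record says was never made (Finding 11): every
# returned questionnaire counts, whether or not a remark was written.
tally = Counter(r["answer"] for r in responses)
print(tally)  # Counter({'satisfactory': 3, 'unsatisfactory': 1})

# The approach the record describes: strengths and weaknesses keyed
# to free-text remarks alone, which over-weights respondents who
# happened to take the time to write something.
weaknesses = [r["remark"] for r in responses if r["remark"] is not None]
print(weaknesses)  # ['late deliveries on two task orders']

The sketch shows why the expert called the process unscientific: the second approach lets the subset of respondents willing to write remarks drive the identified strengths and weaknesses, while the uniform multiple-choice data go unused.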
The SSAC

14. The SSAC, with the help of outside consultants, reviewed the SSEB's final report and performed a detailed comparative analysis of the offerors' proposals. Protest File, Exhibits 67, 70. First, the SSAC accepted the SSEB's evaluation results with one exception: the SSAC changed Unisys' performance risk rating for management "item" 3 (program management) [redacted]. As a result, the overall performance risk rating for Unisys' management "area" changed [redacted]. Id., Exhibit 68.

15. The SSAC made the change because it determined that contracts on which performance problems were identified were more relevant to the management area than the PRAG's original relevancy ranking had estimated. The PRAG's relevancy ranking had been based primarily on technical factors, and the SSAC felt that management issues were equally relevant, regardless of the technical issues involved in the contract. Transcript at 101-04. The SSAC did not reevaluate TRW's risk assessment by reevaluating relevancy, however. Had it done so, the reevaluation would likely have made little difference in the overall risk assessments. Protest File, Exhibit 68 at 36, 37, 45, 46, 50, 51. Thus, even if all the strengths and weaknesses were considered equally relevant, TRW would remain superior in that area.

16. Because the highest rated proposals were not the lowest priced, the proposals had to be compared in some manner. The SSAC attempted to do this comparison by performing a cost/technical trade-off analysis. First, out of about 1,400 technical standards taken from the RFP, the SSAC identified over fifty discriminators which it felt would have significant payoff possibilities for the Government. Protest File, Exhibit 67. A "discriminator" is an important technical or management standard which can be used to compare one proposal to another. These discriminators can be either "qualitative" or "quantitative."

17. Approximately twenty of the discriminators "could not realistically be quantified," so their significance was described qualitatively. Transcript at 83-85. Although no additional comparative analysis of the qualitative discriminators was made, the SSAC did note in its report that Unisys' "majors" were "close behind" TRW's. Protest File, Exhibit 67.

18. The value of the remaining discriminators was quantified through a process referred to as "normalization." Essentially, this process involved identifying the offeror which had the best proposal for a particular technical or management standard identified as a discriminator, and then estimating the cost necessary to bring each of the other proposals up to the level of the best one. Protest File, Exhibit 67; Transcript at 435-38. All of the necessary additions were then added to each offeror's evaluated price to come up with a "best value price."
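The arithmetic of the "normalization" process described in Finding 18 can be restated in a short sketch. The offeror labels and dollar figures below are hypothetical placeholders (the actual evaluated prices and adjustments are redacted); only the mechanics come from the finding:

# A minimal sketch of the "normalization" arithmetic described in
# Finding 18. Offerors "A" and "B" and all dollar figures are
# hypothetical placeholders; the actual amounts are redacted.

evaluated_price = {"A": 100.0, "B": 92.0}  # evaluated prices ($M)

# Estimated cost ($M) to bring each proposal up to the level of the
# best-offered feature for each quantified discriminator (zero for
# the offeror whose proposal already sets the standard).
adjustments = {
    "low-end NMS":          {"A": 0.0, "B": 1.5},
    "application metering": {"A": 2.0, "B": 0.0},
}

def best_value_price(offeror: str) -> float:
    """Evaluated price plus the sum of the cost-to-match additions."""
    return evaluated_price[offeror] + sum(
        d[offeror] for d in adjustments.values()
    )

for offeror in ("A", "B"):
    print(offeror, best_value_price(offeror))
# A 102.0
# B 93.5  -> B keeps an 8.5 ($M) "best value price" advantage

Under this mechanic, the proposal with the lower "best value price" remains the better value even after every quantified deficiency has been costed out.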
19. In two relevant instances, the SSAC identified two different ways to bring the relatively deficient proposal up to the standard set by the best offered feature. In the discriminator called "low-end NMS [network management system]," the SSAC assessed an advantage to Unisys' proposal over TRW's proposal because Unisys offered a faster computer for this feature. One way to quantify the difference was to determine the cost of buying the faster computer from another contract, in this case, Desktop IV. The difference determined by this method, the so-called "low range" adjustment, amounted to about [redacted]. The "high range" adjustment of over [redacted] was determined by calculating the additional cost of buying a different computer from another part of TRW's proposal, i.e., staying within TRW's ULANA II proposal. This other TRW-offered computer, however, was not comparable to the Unisys computer that the SSAC was trying to match because it was much more powerful. Protest File, Exhibit 67; Transcript at 137.

20. The other relevant area in which two different dollar figures were used was in a discriminator called "application metering." Here, TRW proposed a metering package which "controls the number of people who can access licensed software at any one time. As a result, a given number of users can be supported with fewer software licenses." Protest File, Exhibit 67. The SSAC quantified the low range adjustment for Unisys at [redacted], which represented the cost of purchasing the software from another contract. To get the high range adjustment of [redacted], the SSAC calculated the cost of obtaining the additional licenses which the Air Force would need to purchase if it did not otherwise obtain the metering package. Id.

21. The results of the cost/technical trade-off were as follows: TRW's low range "best value price" was determined to be [redacted]; Unisys' low range price was [redacted]. TRW's high range best value price was determined to be [redacted], with Unisys' coming in at [redacted]. Protest File, Exhibit 67. Thus, after trading off technical, management and cost factors, the SSAC determined that Unisys had a best value cost advantage of [redacted] at the low range and [redacted] using the high range adjustments. These "best value" prices did not take into account the effect of the non-quantified (qualitative) discriminators. See Finding 17. The SSAC made no effort to trade off or otherwise analyze the relative benefits of the past performance risk ratings as required by section M-705 of the RFP. Transcript at 84-85; Finding 6.

The SSA's Decision

22. The SSA selected one of EDS's proposals as the "best value" and awarded EDS one of the two contracts. Protest File Supplement 2, Exhibit 6. The decision as to which proposal represented the second-best value, however, was more difficult. First, the SSA recognized in the "Technical and Management Assessment" section of her Source Selection Decision document that TRW offered the superior technical solution, an acceptable management solution, and had a strong past performance risk rating as compared to the past performance risk rating for Unisys. Protest File, Exhibit 70.

23. Next, the SSA proceeded to her "Best Value Analysis." After noting the evaluated prices [redacted], the SSA posited the following "key question":

     whether TRW's technical superiority at a higher evaluated price but low risk based on past performance represented a better value to the Government than the Unisys Corporation's superior management proposal at lower evaluated price but with performance risk based on a past performance assessment.

Protest File, Exhibit 70. The SSA answered this question by looking at the SSAC's cost/technical trade-off analysis:

     I have studied the results of the SSAC's technical trade-off analyses. Based on technical trade-off analysis, the SSAC assessed quantified differences between TRW and Unisys for the discriminatory technical and management capabilities described in Section III.
     The net result of this analysis is that the SSAC Report identified quantitative best value differences between the prices of Unisys and TRW which appear to give Unisys an advantage [redacted].

Id.

24. The SSA then departed from the SSAC's cost/technical trade-off in two areas. The first involved a Unisys advantage for its on-call maintenance program:

     After reviewing the rationale used by the SSAC in arriving at this range, I discussed with the SSAC their assumptions with respect to the best value analysis for the management area. Specifically, for Item 2 of Management Area (Maintenance) the SSAC gave Unisys a [redacted] advantage on the presumption that more sparing[foot #] 3 would be required under the TRW MCBI [sic - stands for mail back, carry in] than the Unisys On-Call program. The past performance evaluations indicate that [redacted]. This will force equivalent sparing and/or result in unacceptable levels of system downtime, the operational impacts of which could very well exceed the spares savings advantage assess [sic] to Unisys in the technical tradeoff analysis. Conversely, TRW's past performance was evaluated in virtually every instance in strongly positive terms leading to the belief that they may actually perform better than promised. Accordingly, the SSAC and I agreed that Unisys should not be credited with this advantage.

Protest File, Exhibit 70.

----------- FOOTNOTE BEGINS ---------
[foot #] 3 "Sparing" refers to the necessity of keeping spare parts on hand in the event of an equipment breakdown. Under the SSAC's analysis, the Air Force would have to spend an additional [redacted] if it accepted TRW's proposal because TRW offered maintenance services that could take up to [redacted] to accomplish. Unisys offered on-call maintenance.
----------- FOOTNOTE ENDS -----------

Thus, the SSA decided not to credit Unisys for its maintenance advantage due to past performance deficiencies. Unisys, however, had been evaluated as having a "low" performance risk under management item 2, "warranty and maintenance." See Findings 3, 13. The SSA determined that Unisys' performance ratings in the other management items, i.e., life cycle support and program management, were relevant to assessing the likelihood of receiving the benefits of Unisys' superior maintenance program. Transcript at 672, 677.

25. The second area of disagreement with the SSAC report centered around the NMS software discriminator:

     With respect to the technical tradeoff pertaining to the Network Management System (NMS), the SSAC indicated a disadvantage for TRW in a range of [redacted]. The low end of this range assumed that the government would obtain a more performance proficient hardware necessary for optimal performance of the software from another contract (e.g., Desktop IV). This clearly is the logical and reasonable course of action for the government to take. Accordingly, the apparent advantage for Unisys over TRW must be reduced by the difference, [redacted].

Protest File, Exhibit 70. Thus, the SSA determined that the low range amount (based on buying the computer from another contract) was the appropriate amount for this discriminator. See Finding 19. Implicit in the SSA's decision to subtract [redacted] from the Unisys advantage, however, is her unstated decision to use the high range amount for the application metering discriminator. See Finding 20. In other words, if the low range were to be used for both discriminators, the difference in the best value prices is simply the low range best value difference calculated by the SSAC -- [redacted].
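The effect of mixing the low range for one discriminator with the high range for the other (Findings 19-21 and 25) is simple arithmetic, sketched below with invented figures; the actual dollar amounts are redacted from the public opinion:

# Hypothetical sketch of the range-selection arithmetic discussed in
# Findings 19-21 and 25. Every figure is an invented placeholder.

BASE_GAP = 10.0  # offeror U's assumed evaluated-price advantage ($M)

# (low, high) cost-to-match ranges for the two disputed discriminators.
NMS = (1.0, 4.0)       # feature favored U, so the charge falls on T
METERING = (2.0, 6.0)  # feature favored T, so the charge falls on U

def u_advantage(nms: float, metering: float) -> float:
    """U's 'best value price' advantage after both adjustments."""
    return BASE_GAP + nms - metering

print(u_advantage(NMS[0], METERING[0]))  # low/low   -> 9.0
print(u_advantage(NMS[1], METERING[1]))  # high/high -> 8.0
print(u_advantage(NMS[0], METERING[1]))  # low NMS, high metering -> 5.0

With these placeholders, choosing the low range for both discriminators leaves the gap at the SSAC's own low range difference, while the mixed choice implicit in the SSA's analysis (low range for the discriminator favoring U, high range for the one favoring T) yields the smallest advantage for U of any combination.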
26. The SSA provided no credible reason for using the low range amount for one discriminator (the cost of purchasing the NMS computers from another contract) but not for the other discriminator (the cost of purchasing the application metering software from another contract at a similar savings). The SSA explained that she decided to use the high range amount for the application metering software discriminator because the SSAC could not specify a particular contract (such as Desktop IV for the NMS discriminator) through which Air Force personnel could purchase the application metering software. The record, however, shows that the Desktop IV contract is approaching its maximum order limitation and will not support the purchase of the NMS computers. Transcript at 738, 744-46. Moreover, the chairman of the SSAC testified that the SSAC verified that the Air Force actually could purchase the software "off-contract" as part of its determination to include the low range figure of [redacted]. Id. at 133.

27. After the SSA's two adjustments to the SSAC's cost/technical trade-off analysis, it becomes very difficult to tell what else went into the SSA's selection decision. At this point in the SSA's analysis, even with her two best value adjustments, Unisys still had about a [redacted] "best value price" advantage. On the one hand, the SSA wrote:

     Over and above these considerations, the negative past performance assessment given Unisys give [sic] rise to larger concerns of ability to meet delivery and installation schedules. When programs are instituted to install Local Area Networks, there is a considerable commitment of resources, including civil engineering and government management resources. In addition, frequently specific mission needs or objectives depend on the timely accomplishment of technical infrastructure tasks such as the installation and operation of a local area network. To the extent that the negative past performance ratings for Unisys presage delivery delays and installation snags, the cost to the government over the life of the contract could well be in the tens of millions of dollars. Hence, the SSAC and I have determined that there is no cost advantage, and potentially a cost disadvantage, were Unisys to be selected over TRW.

Protest File, Exhibit 70. The SSA testified, however, that neither this item, nor those that follow it in the Source Selection Decision document, were "determining factors" in the award decision. Transcript at 712-18. At the SSA's deposition, which is in evidence, the SSA testified that this item, and the statements that follow it in the Source Selection Decision document, were not "factors" at all. Deposition of Darlene Druyun at 124.

28. The rest of the Source Selection Decision document states:

     However, these considerations alone are not the only reason to turn away from Unisys. Clearly, the data provided by the SSAC indicate a high probability that the hardware products and associated prices bid on these respective proposals will be replaced in large measure fairly early in the life of the contract. By way of contrast, the technical services, which are level of effort tasks, will remain essentially constant. For these engineering and implementation services, the price quoted by Unisys is approximately [redacted] higher than any of their competitors, including TRW. There is no apparent value to the government from this higher price.
     With respect to the out year pricing, the SSAC concluded that some of the price difference between TRW and Unisys was due to Unisys' more aggressive discounting of pricing in the option years. If out year product substitution occurs as we have realized in other contracts of this nature, the relative differences in out year pricing would be substantially reduced.

     In addition to the quantitative features above, TRW proposed superior qualitative features in discriminators such as varying GOSIP support, gateway line speeds and increased expansions in hub capacity. The cumulative benefit of these features includes increased interoperability, enhanced productivity and future expandability.

     Since the source selection plan provides for primary emphasis on technical over management, and management over cost, and for equal weight to be given to past performance, and since TRW is clearly the superior technical solution, I took these considerations into account in determining that TRW is the best value second choice.

Protest File, Exhibit 70. Again, although apparently important enough to write down in the Source Selection Decision document, the SSA testified that these items were not factors in the award decision. Transcript at 712-18.

Discussion

Motions to Dismiss

TRW has moved to dismiss Count II of Unisys' protest as untimely filed. In Count II, Unisys maintains that the method used by the Air Force for assessing performance risk was fatally flawed. According to TRW, portions of this count are really nothing more than complaints about the terms of the solicitation, which should have been filed prior to the time for receipt of proposals. See Rule 5(b)(3)(i). We disagree with TRW's characterization of Count II. Unisys' complaint about the past performance evaluations clearly goes to how the proposals were evaluated, given the solicitation provisions as written. Because this ground was filed within ten working days of the time Unisys knew or should have known the basis for it, Count II was timely filed. Rule 5(b)(3)(ii).

The Merits

Count I

In Count I, Unisys maintains that the Air Force's decision to award the second ULANA II contract to TRW was based on a fundamentally flawed best value analysis. According to Unisys, the best value determination is not supported by the Source Selection Decision document, the SSA's testimony regarding the determination, or TRW's "unilateral, post-hoc rationalizations." Protester's Post-hearing Brief. As discussed below, we agree with Unisys.

The Board reviews best value determinations de novo. Grumman Data Systems Corp. v. Widnall, 15 F.3d 1044, 1046-47 (Fed. Cir. 1994). Our task in reviewing these determinations (and the cost/technical trade-offs that support them) is not to substitute the Board's judgment for that of agency officials, but to determine whether the considerable discretion granted agency officials in this area has been exercised in a reasonable way. Lockheed Missiles & Space Co. v. Department of Treasury, GSBCA 11776-P, 93-1 BCA ¶ 25,401, 1992 BPD ¶ 155, aff'd, 4 F.3d 955 (Fed. Cir. 1993). When an agency determines that a higher-priced proposal represents a better value than a lower-priced one, the record must "demonstrate with reasonable certainty consistent with the terms of the solicitation that the added value of [the proposal] is worth the higher price." B3H Corp. v. Department of Air Force, GSBCA 12813-P, 94-3 BCA ¶ 27,068, at 134,888, 1994 BPD ¶ 142, at 17, appeals docketed sub nom. Widnall v. B3H Corp., Nos. 95-1042, 95-1043 (Fed. Cir. Nov. 9, 1994).
A cost/technical trade-off which "fail[s] to indicate whether the government would receive benefits commensurate with the price premium it proposed to pay" is legally deficient. Lockheed Missiles & Space Co. v. Bentsen, 4 F.3d 955, 959-60 (Fed. Cir. 1993).

The Air Force's cost/technical trade-off analysis, and the resulting best value determination, fail to reasonably establish, consistent with the terms of the solicitation, that TRW's technically superior proposal is worth the additional cost. After the proposals were evaluated, TRW's proposal rated higher technically and Unisys' proposal was determined to be about [redacted] cheaper in evaluated cost. Findings 13-15. Then, to compare the proposals to each other (and to others), the SSAC performed a cost/technical trade-off. After the trade-off analysis, which traded off cost and technical/management differences, but not performance risk, the SSAC found that Unisys had a "best value price" advantage of between [redacted]. Findings 16-21. This means that, after adding the amounts to each proposal necessary to bring them to technical equivalency, Unisys' proposal was still cheaper than TRW's proposal. Id.

According to the solicitation, the SSA was to perform "an integrated assessment of the evaluation results and the cost/technical tradeoff," and then "determine which proposals represent the best value to the government." Finding 6. In the Source Selection Decision document, the SSA posed the following question:

     whether TRW's technical superiority at a higher evaluated price but low performance risk based on past performance represented a better value to the Government than the Unisys Corporation's superior management proposal at lower evaluated price but with performance risk based on a past performance assessment.

Finding 23. Essentially, the SSA answered this key question by making a "best value analysis" which used as a baseline the SSAC's cost/technical trade-off. Findings 22-23.

The SSA departed from the SSAC's trade-off analysis in two ways. First, she decided not to credit Unisys for its best value advantage of [redacted] for offering a superior maintenance program. The SSA made the adjustment because she believed that, due to "serious performance deficiencies," the Government would probably not receive the benefit of Unisys' superior maintenance program. Finding 24. Unisys, however, had been rated as having "low" performance risk under the item "warranty and maintenance." Findings 3, 13. We do not agree that the SSA acted reasonably in allowing Unisys' risk ratings in other items to "bleed over" to the maintenance area so as to completely eliminate this specifically measured advantage. The fact that this was a best value procurement does not permit the SSA to ignore evaluation results without some indication that the ratings are erroneous or subject to a different interpretation.

In a second adjustment to the SSAC's cost/technical trade-off analysis, the SSA determined that it was appropriate to use the low range amount for one discriminator that favored Unisys (NMS), and the high range amount for a similar discriminator that favored TRW (application metering). See Finding 25. This decision had the effect of raising Unisys' best value price by [redacted]. As discussed in detail in the findings of fact, the SSA's reason for treating these discriminators differently -- that the NMS computers could be purchased easily off contract and the application metering software could not -- is not supported by the record.
Finding 26.[foot #] 4

----------- FOOTNOTE BEGINS ---------
[foot #] 4 TRW's expert in best value decisions provided another reason for using the high range amount for the application metering software discriminator:

     In fact they're really two different choices. In the network management system situation, you've got a case of buying off the contract to obtain the equivalent workstation, which was about [redacted], versus buying the superior TRW high-end machine. We've talked about that. So that it was a choice between two different normalization techniques. But what I've said is that the high-end solution really wasn't normalization at all. It resulted still in a disparity. It has no relevance at all to the situation of application metering. Application metering presented the SSA with two numbers. One represented a normalization outcome, the cost to obtain the benefit of having application metering, the cost avoidance. So her choice there was do I use the benefit associated with that discriminator, or do I use the cost to obtain the benefit? And I think appropriately she chose to use the benefit associated with application metering.

Transcript at 826-27. We have two problems with this expert's opinion. First, the SSA's choice to use the high range amount for the application metering discriminator had nothing to do with an "economic benefit" analysis. It had to do with the lack of an identified contract from which to purchase the software. Second, the Air Force used a "normalization" analysis, not an "economic benefit" analysis, to perform its cost/technical tradeoff. Finding 18. There are many possible ways to compare cost and technical factors and, although we are certain that TRW's excellent lawyers and experts can find any number of ways to justify the award to TRW, there is no evidence that the Air Force either used such an analysis, or agrees now that such an analysis is the correct one. Although the Board is charged with reviewing the actions of the Air Force in this procurement de novo, the Board cannot substitute its judgment, or that of expert witnesses, for the judgment of the Air Force. Grumman Data Systems v. Department of the Air Force, GSBCA 11939-P, 93-2 BCA ¶ 25,776, 1993 BPD ¶ 2, aff'd, 15 F.3d 1044 (Fed. Cir. 1994).
----------- FOOTNOTE ENDS -----------

At this point in the SSA's analysis, even with both of the adjustments to the SSAC's cost/technical trade-off, Unisys still had a best value advantage of about [redacted] over TRW. Finding 27. In other words, without additional best value "factors" which weigh in favor of TRW, the award should have gone to Unisys, not TRW.

We say this recognizing that the solicitation told offerors that proposals would be evaluated in the three areas, "technical," "management," and "cost/price," listed in descending order of importance. Finding 2. That provision does not, as the Air Force and TRW argue, permit the Air Force to award a contract to an offeror on the basis of technical superiority when a best value analysis shows a different proposal to be the better value. Such an interpretation would be inconsistent with other solicitation provisions which detail how the Air Force will conduct a cost/technical trade-off and a best value analysis. Section M of the solicitation, in addition to setting forth a specific procedure for trading off technical, management, cost, and risk factors, also provides as follows:

     The Government is more concerned with obtaining superior technical and management features than with making awards at the lowest [as opposed to the best value] price. Awards will be determined by comparing differences in the benefit of technical and management features with differences in cost to the Government.

Finding 5 (emphasis added). Read together, the solicitation provisions mean that the Government will award the contract to
the offeror whose proposal turns out to be the best value (as determined by the comparative best value analysis), notwithstanding the fact that that proposal may not be the lowest priced. It clearly does not mean that the Air Force may award a contract to a technically superior offer where a best value price analysis shows another offer to be a better value to the Government.

By our account, which eliminates much of the SSA's two adjustments to the SSAC's cost/technical trade-off, Unisys had a best value price advantage of about [redacted]. Although the Source Selection Decision document goes on to discuss other aspects of the proposals, the SSA testified that none of these items were "determining factors" in her decision to award the contract to TRW. Findings 28-29. Thus, as written and testified to, the best value analysis supports an award to Unisys, not TRW.

One issue which the Source Selection Decision document discussed was the following:

     Over and above these considerations, the negative past performance assessment given Unisys give [sic] rise to larger concerns of ability to meet delivery and installation schedules. When programs are instituted to install Local Area Networks, there is a considerable commitment of resources, including civil engineering and government management resources. In addition, frequently specific mission needs or objectives depend on the timely accomplishment of technical infrastructure tasks such as the installation and operation of a local area network. To the extent that the negative past performance ratings for Unisys presage delivery delays and installation snags, the cost to the government over the life of the contract could well be in the tens of millions of dollars.

Finding 27. The effects of the performance risk ratings, weighted in accordance with the solicitation, should be a factor in this best value analysis. The solicitation specifically required the SSAC to "determine the benefit of Technical, Management, evaluated cost/price areas, and past performance record." Finding 6. Because the SSAC never performed the required benefit analysis of past performance, Finding 21, the task was left to the SSA on the last day of a seventeen-month procurement. Thus, it is not surprising that the statement quoted above is not a reasonable analysis of the benefits of past performance. The Air Force should further analyze this issue when conducting a new best value analysis.[foot #] 5

----------- FOOTNOTE BEGINS ---------
[foot #] 5 Similarly, it would be appropriate for the Air Force to consider as a factor in the best value determination a comparative analysis of the qualitative discriminators, a subject which was mentioned only briefly in the Source Selection Decision document. See Finding 28.
----------- FOOTNOTE ENDS -----------

The Air Force and TRW argue that, regardless of the SSA's actual reasons for selecting TRW, the Board should deny the protest because TRW's proposal was rated the highest in the most important area (technical), with a higher, but still reasonable, price (the least important area), and a low performance risk.
Given these circumstances, they argue, the selection of TRW could never be unreasonable. The problem with this argument is that Unisys' proposal, with its evaluated price advantage and "best value price" advantage of about [redacted], offset by somewhat higher performance risk, would also be a reasonable choice. The Air Force, which is charged with making the selection decision, has not provided a rational reason for selecting TRW over Unisys, either during the procurement or during the protest (by adopting, for the Board's consideration, an alternative analysis).

A cost/technical trade-off which "fail[s] to indicate whether the government would receive benefits commensurate with the price premium it proposed to pay" is deficient. Lockheed Missiles, 4 F.3d at 959-60. The SSA's cost/technical trade-off, which failed to reasonably establish that TRW's proposal was worth the additional cost, was deficient. Accordingly, the award to TRW cannot stand.

Count II

In Count II, Unisys maintains that the method used by the Air Force to evaluate past performance was fatally flawed. We deny this ground of protest. We found as fact that the PRAG process, which essentially consisted of analyzing responses to survey questionnaires, was not very scientific. Findings 8-11. We also found, however, that the questionnaire responses disclosed an undeniable dissatisfaction with Unisys' performance on certain contracts. Although the process used to assess past performance may not be reliable enough to withstand challenges in all cases, in this case, Unisys has not convinced us that it has been prejudiced by the flaws in the process.

Count III

In Count III, Unisys argues that TRW's proposal should have been rejected by the Air Force because TRW failed to comply with a solicitation requirement to provide past performance information for significant subcontractors. Although TRW failed to provide all required information on its wholly-owned subsidiary, TSSCo, it did list among the twenty-two relevant contracts provided to the Air Force for past performance evaluation six contracts on which TSSCo had performed portions of the work. Finding 7. No significant past performance problems were found with respect to those contracts, and Unisys has not explained how TRW's failure to provide additional information about TSSCo would have influenced the award decision. Id. Count III is denied for lack of prejudice.

Decision

Count I of the protest is GRANTED. The Air Force's delegation of procurement authority is amended to require it to make another selection decision based on a new best value analysis. Counts II and III are DENIED.

_______________________
ROBERT W. PARKER
Board Judge

We concur:

_______________________
STEPHEN M. DANIELS
Board Judge

_______________________
JOSEPH A. VERGILIO
Board Judge