THIS OPINION WAS INITIALLY ISSUED UNDER PROTECTIVE ORDER, RELEASED TO THE PUBLIC IN REDACTED FORM ON APRIL 7, 1992, AND IS BEING RELEASED IN ITS ENTIRETY ON MARCH 22, 1994

DECIDED: March 19, 1992

GSBCA 11635-P

GRUMMAN DATA SYSTEMS CORPORATION,
Protester,

v.

DEPARTMENT OF THE AIR FORCE,
Respondent,

and

CONTEL FEDERAL SYSTEMS, INC.,
Intervenor.

Peter L. Winik, Franklin G. Snyder, and James H. Barker of Latham & Watkins, Washington, DC, and Patrick O. Killian of Grumman Data Systems Corporation, Bethpage, NY, appearing for Protester.

Carl J. Peckinpaugh, James C. Dever, III, and Joseph M. Goldstein, Office of the General Counsel, Department of the Air Force, Washington, DC, and Richard C. Bean and Paul E. VanMaldeghem, Hanscom AFB, MA, appearing for Respondent.

William H. Butterfield and Charlotte F. Rothenberg of McGuire, Woods, Battle & Boothe, Washington, DC, and Phillip L. Radoff of GTE Government Systems, Chantilly, VA, appearing for Intervenor-Respondent.

Before Board Judges NEILL, HYATT, and WILLIAMS.

WILLIAMS, Board Judge.

In this protest filed on January 2, 1992,[foot #] 1 Grumman Data Systems Corporation (Grumman) has challenged the Air Force's award of a contract for a network for the Joint Chiefs of Staff, known as the Joint Staff Automation for the Nineties (JSAN), to Contel Federal Systems, Inc. (Contel). Grumman claims that the award is illegal on four independent grounds: (1) the Air Force failed to make any meaningful cost-benefit analysis in awarding to a higher priced offeror; (2) the Air Force relied on a scheme of undisclosed technical evaluation criteria; (3) Contel's proposal failed to comply with the mandatory B2 security requirement; and (4) the Air Force failed to conduct adequate discussions.

For the reasons stated below, the Board grants the protest, finding that the Air Force did not sufficiently analyze whether the increased cost of Contel's proposal was justified. We thus revise the agency's delegation of procurement authority (DPA) to require it to perform a proper cost/technical tradeoff analysis, and after doing so to make an award based upon that analysis. Our suspension of respondent's DPA lapses by its terms.

Findings of Fact

On January 22, 1991, the Air Force issued request for proposals (RFP) number F19630-90-R-0006 for JSAN. Protest File, Exhibit 54. The procurement was to integrate existing computers and associated peripherals known as JIMS (Joint Staff Information Management Systems) with the contractor-provided JSAN system, and to manage the transition between JIMS and JSAN with minimal disruption to the Joint Staff.[foot #] 2 Id. The RFP contemplated a four- to five-year transition period involving 200-400 users per year. Id. at 47.

The RFP envisioned the award of an indefinite-delivery, indefinite-quantity (IDIQ) type contract for a base period and seven option years. The contract was to be fixed price for all Contract Line Item Numbers (CLINs) except the labor intensive CLINs, which would contain an economic price adjustment (EPA) provision. The Air Force's original DPA for this procurement was $115 million, but in September 1991 the Air Force received an upward revision of its DPA to $130 million. Protest File, Exhibit 2797.

----------- FOOTNOTE BEGINS ---------

[foot #] 1 Protester was granted leave to file an amended complaint on January 14, 1992, which added a substantive amendment, making the due date for the decision on this protest March 19, 1992. Rule 7(f)(2).

----------- FOOTNOTE ENDS -----------
----------- FOOTNOTE BEGINS ---------

[foot #] 2 JIMS is an "umbrella term" which defines "a wide variety of hardware and software and operates as a series of parts rather than a unified whole."

----------- FOOTNOTE ENDS -----------

The Air Force Computer Acquisition Center's (AFCAC's) preliminary cost estimate for this procurement, as stated in the JSAN acquisition plan, was $124 million. Transcript at 279; Protest File, Exhibit 111. AFCAC determined that, based on historical data, contract award price runs approximately 70 percent of the preliminary independent cost estimate (ICE). Transcript at 279. Ultimately, the Air Force's final ICE, which was completed after the acquisition plan, was $185 million. Transcript at 281; Protest File, Exhibit 2823.

This procurement was conducted under Air Force Regulation (AFR) 70-30, Streamlined Source Selection Procedures, and was administered by AFCAC at Hanscom AFB in Massachusetts. Protest File, Exhibit 114 at 1. AFCAC employed the alternative source selection organization using a Source Selection Evaluation Board (SSEB), a Source Selection Advisory Council (SSAC), and a Source Selection Authority (SSA). The role of the SSEB was to conduct an in-depth evaluation and review of each proposal against the solicitation, the evaluation criteria, and the evaluation standards, and to prepare an evaluation report to the SSAC. AFR 70-30. The SSAC, which included AFCAC personnel and members of the Joint Staff, was charged with analyzing the findings of the SSEB and comparing proposals based on analysis of the SSEB findings and negotiations. Id. The SSA, the commanding general of the Standard Systems Center at Gunter Air Force Base in Alabama, was responsible for conducting the source selection process and making the final selection decision. Id.; Transcript at 478.

AFR 70-30 required the Air Force to establish objective evaluation standards in addition to the requirements and evaluation criteria set forth in the RFP, but prohibited the Air Force from disclosing such standards to offerors.[foot #] 3 AFR 70-30 provides in pertinent part:

a. The Technical Team will establish objective standards at the lowest level of subdivision of specific evaluation criteria.

b. Standards, which indicate the minimum performance or compliance acceptable to enable a contractor to meet the requirements of the solicitation and against which proposals are evaluated, will be prepared for the lowest level of subdivision within each area of the specific evaluation criteria and be approved by the SSET chairperson. . . .

----------- FOOTNOTE BEGINS ---------

[foot #] 3 AFR 70-30 defined evaluation standards as: "A statement of the minimum level of compliance with a requirement which must be offered for a proposal to be considered acceptable." Id.

----------- FOOTNOTE ENDS -----------

Air Force Federal Acquisition Regulation Supplement, Appendix BB, 6 CCH Government Contracts Reptr. 40,953.90 at 28,983. Protester was familiar with AFR 70-30 and knew that the Air Force would employ evaluation standards which would not be released to offerors. Transcript at 646-47, 697. In accordance with AFR 70-30, prior to the issuance of the RFP, AFCAC and the Joint Staff developed standards which were approved by the SSAC. Transcript at 35, 41-42.

The RFP: Requirements

Section C of the RFP set forth the specific requirements of JSAN. Among the many requirements were four at issue here -- user friendliness, system architecture, security, and the connection between JIMS and JSAN.
In addition, paragraph C.9.2 required that JSAN support all functional requirements by minimizing the need for human intervention. Protest File, Exhibit 54 at 61.

The RFP's Requirements for User Friendliness

In Section M, the RFP listed six technical evaluation areas in descending order of importance, without specifying numerical weights. The first technical area was user friendliness. Section C.9.3, User Friendliness, provided:

9.3.1 At the system level the Contractor shall provide a user friendly interface as described in C11.3 to all functional capabilities and network services. . . . This flexibility extends to such aspects of workstation operations as: what the user sees, how the user enters information and commands, and how the user receives replies. The interface shall be uniform in appearance and function. Capabilities shall exist to allow the experienced user to bypass basic steps. . . .

Id.

System Architecture

System architecture was the second most important technical area. Id. at 320. Section C.10.1.2 described this requirement as follows in pertinent part:

The anticipated architecture is a high-speed network with definable communities-of-interest. The network shall be designed such that there is no single point of failure for the network as a whole. . . .

Id. at 63.

The JIMS/JSAN Connection

Section C.10.1.6, CLIN 0090, required the contractor to provide "a connection between JSAN and the current JIMS network for the entire transition period to the JSAN environment." Protest File, Exhibit 54 at 66; RFP Figure C-23; Transcript at 727-29, 877. The RFP's illustrative diagram of the JIMS/JSAN transitional period depicted the Wang VS 300 as the vehicle to connect JIMS to JSAN. Protest File, Exhibit 54 at 179, Figure C23; Transcript at 728. Although this illustrative diagram did not show any other gateway, it did show "other connected internal and external systems." Protest File, Exhibit 54 at 179. There was in fact another pre-existing vehicle in JIMS, the Wang VS 10000, at which the RFP required a connection between JIMS and JSAN. Transcript at 728-29. Paragraph C.4.1 described the connection between JSAN and JIMS as a "bridging capability." Id. at 47. The RFP did not specify how the connection between JIMS and JSAN was to be achieved; nor did it expressly limit the number of gateways through which the connection could be achieved.

The Security Requirement

Security was the fourth most important of the six technical areas. The solicitation delineated evolving requirements for system security in three phases. In Phase 1, the contractor was to "maintain the current features and capabilities of the existing security system . . . begin[ning] as the JSAN Contractor assumes hardware and software maintenance responsibilities for JIMS and . . . continu[ing] until no JIMS components are in service." Id. at 58. In Phase 2 the system was to provide multiple levels of security. Id. Phase 3, which is at issue here, was described in pertinent part:

C.9.1.4 Security Phase 3 - This phase begins when ordered by the Government, but no sooner than 24 months after contract award. . . . During this phase the Contractor shall:

. . . .

9.1.4.3 Provide, as a minimum, an NSA [National Security Agency] Evaluated B2 TCB [Trusted Computing Base] or a TCB that is in the Formal Evaluation phase of the NSA evaluation process for B2 . . . . The TCB shall be certifiable in the Joint Staff environment. As a minimum, the TCB shall provide functionality at the B2 level of trust . . . .
Id. at 60.[foot #] 4

----------- FOOTNOTE BEGINS ---------

[foot #] 4 The Air Force's security evaluation team included two representatives from NSA as well as a representative from the Air Force Cryptological Center. Transcript at 315.

----------- FOOTNOTE ENDS -----------

As reflected in Amendment 3 to the solicitation, the Air Force clarified the security requirement in answering question 59. The question pointed out, inter alia, that NSA had publicly estimated it takes over two and one-half years to evaluate a B2 level product, and asked the Government to reevaluate this requirement. The Air Force responded:

Although highly desirable, there is no requirement for any portion of the TCB during any phase to ever successfully COMPLETE NSA evaluation. . . . Regardless of whether NSA accepts a component of the TCB for evaluation . . . documentation must be submitted to the government as stated in C9.1.4.3. NSA has advised us that the length of the vendor assistance phase is solely determined by the vendor. NSA has also stated the average time from initial submission to a fully evaluated product is . . . 12 - 18 months for B2 products. . . . The Government requires TCB components with commercial appeal to be, as a minimum, in the formal evaluation phase because it does not intend to fund developmental efforts by the Contractor.

Protest File, Exhibit 2855 at 19-20. Section C of the solicitation also listed desirable features. Protest File, Exhibit 54; Transcript at 60.

Proposal Preparation Instructions and Evaluation Factors

Section L.18.1 of the RFP, the instructions on preparation of technical proposals, addressed delayed deliverables as follows:

For those items designated as delayed deliverables, provide a certification letter from the source of the product that the requirements will be met.

Protest File, Exhibit 54 at 275. Section L.21 included lists of specific technical questions organized under the six areas for technical evaluation. Id. at 283-87. The stated purpose of the questionnaire was:

[T]o obtain supplementary information concerning the proposal to assist the Government in evaluating proposals against solicitation evaluation criteria, or to enhance the Government's understanding of the degree to which the proposal meets evaluation standards. Replies will be used, as indicated in Section M3, in evaluating proposals against evaluation criteria.

Id. at 282. Among the eight technical questions on user friendliness was the following:

How does your implementation make menu and icon commands and instructions easy to understand? Describe how the applications give users feedback on complex actions by unobtrusively guiding the user.

Id. at 283. Under System Architecture, the RFP included the following questions:

Describe your communications network design in detail to include: . . . for each network component, . . . redundancy, expected availability, avoidance of single points of failure, and any other details you feel are necessary to fully describe your network design. Describe failure points and the expected number of users affected if failure occurs. (Reference: C10.1, M3.3.1.2.1)

What functions (applications) will not be available for a workstation in the standalone mode if the remainder of the network is unavailable? (Reference: C3.2, C11, M3.3.1.2.3)
Describe the methodology for relieving workstation administrative load . . . . Include how many hours per week it will take to centrally administer a 20 user community-of-interest. (Reference: C10, M3.3.1.2.3)

Describe the system administration required to operate and maintain corporate storage. . . . Indicate the number of manhours required per week to support the administrative workload for the initial configuration of corporate storage? (Reference: C10.6, M3.3.1.2.2)

Id. at 284 (emphasis added).

Paragraph M.2, "Basis of Award," provided that once a proposal had been determined to have met all mandatory solicitation requirements and to have complied with law:

the Source Selection Authority (SSA) will determine the proposal which provides the best overall value to satisfy Air Force needs. This selection will be based on how well each proposal satisfies the evaluation criteria as described in paragraph M3 and also on an integrated assessment of the proposals submitted in response to this RFP.

Id. at 318. Paragraph M.3, "Evaluation Criteria," stated:

3.1 General. The evaluation will be based on an integrated assessment of the offeror's proposal. The integrated assessment of the offeror's proposal will address mandatory requirements, terms and conditions, technical factors, life cycle cost, and level of Live Test Demonstration (LTD) performance. The offeror's proposals will be evaluated and rated[[foot #] 5] by the Government. This assessment will address two areas (listed in order of importance) of consideration which are:

    Technical
    Cost

Id. at 319.

----------- FOOTNOTE BEGINS ---------

[foot #] 5 The solicitation did not specify how proposals would be rated, i.e., numerically or otherwise. In fact, as discussed infra, the Air Force rated proposals in accordance with the color coding scheme set forth in AFR 70-30.

----------- FOOTNOTE ENDS -----------

The RFP did not state how much more important the technical area was than cost. In the area of technical evaluation, the solicitation provided for "evaluation credit" as follows:

Offered components, features and capabilities beyond the minimum mandatory requirements specified in the RFP, Section C references below will be evaluated depending upon the degree to which they benefit the Government. To receive evaluation credit the offered additional component, feature and/or capability beyond the minimum requirements must substantially benefit the Government as determined by its contribution to the specific items, factors, and considerations identified below.

Protest File, Exhibit 54 at 319. The six technical items were listed in descending order of importance:

    User Friendliness
    System Architecture
    Transition
    Security
    Maintenance
    Management

No weights were assigned. The RFP provided that proposals would be evaluated in each technical and cost criterion for compliance with the requirement and soundness of technical approach. Id. For each of the six technical items, specific assessment criteria were then listed in descending order of importance, unless expressly stated to be otherwise.

User Friendliness: RFP Evaluation Provisions

Within the first item, user friendliness, there were two evaluation factors expressly stated to be equal in importance: user interface/integration and application software functionality. Id. at 320. User interface/integration had five considerations:

a. The degree to which software applications are intuitive.

b. The degree to which program function keys and/or menu interfaces are consistent across applications and keystrokes are minimized.

c. The degree to which an UNDO feature is available within all applications.

d. The degree to which the help and error messages are easily understood and context sensitive.
e. The degree to which text and graphics can be integrated and moved between the various software applications.

Id. (citations omitted). Application software functionality had one consideration: the number of desired capabilities provided by the offeror. Id. (citations omitted).

System Architecture: RFP Evaluation Provisions

Within the technical item system architecture there were three factors listed in descending order of importance:

    Communications network
    Workstation
    Corporate storage approach

Within these factors there were three considerations under both communications network and workstation and one under corporate storage. The pertinent consideration under communications network was "redundancy/availability of network . . . and how system prevents single points of failure." Id. The first consideration under workstation was "degree of functionality in standalone mode." Id. at 321. Factors and considerations were set forth similarly under other items. Id.

The RFP: Cost Evaluation

The RFP provided:

M4 Contract Life Cycle Cost Evaluation.

4.1 General. The Government's Most Probable Cost (MPC) for the total life cycle of the contract will be determined by using the prices, terms and conditions of the CLINs/SLINs . . . in conjunction with their associated quantities, delivery dates, and probabilities of occur[r]ence as contained in the cost model. The Government's evaluation will be based on purchase only plans. In addition, an economic analysis that considers the current Treasury Bill Discount Rate will be conducted to determine the present value of monies.

Id. at 324.

The Evaluation Process

The technical evaluation consisted of a two-step process: validation and evaluation. Transcript at 151. Validation was a determination of whether a proposal met the RFP's minimum mandatory requirements and was eligible for award. Id. Validation began when the proposals came in on approximately March 29, 1991. Id. at 155. Following validation, evaluation of proposals against the standards checklist commenced, and evaluators reviewed proposals as well as the offerors' responses to the questionnaires in Section L. Id. at 155. In order to receive evaluation credit for exceeding the minimum requirements, an offeror had to meet the standards. Id. at 36. In some instances, the standards quantified the RFP's general requirements. Protest File, Exhibits 554, 2754. In some cases there were no minimum mandatory requirements which precisely matched up to the standards. Transcript at 36.

Technical evaluators prepared an Evaluation Standards Checklist for each proposal and, for each standard, determined whether the proposal met (indicated by a checkmark), failed (indicated by a "-"), or exceeded (indicated by a "+") the particular standard; these checklists also described the strengths and weaknesses of the proposal vis-a-vis each standard. Protest File, Exhibits 2736-39, 2752-54. Discussions were conducted, and Best and Final Offers (BAFOs) were received on November 15, 1991. Id. at 157.

Deficiency Reports (DRs)

There were hundreds of DRs issued in this procurement. Protest File, Exhibits 161-983.[foot #] 6 DRs were specific to each proposal and, given their number, the wording was not always exactly the same for the identical problem in different proposals. Id. at 102-03. Very few clarification requests (CRs) were issued. Transcript at 189-90.

----------- FOOTNOTE BEGINS ---------

[foot #] 6 Grumman received approximately 638 DRs and Contel, approximately 564 DRs. Protest File, Exhibits 161-983, 1856-1986, 1987-2113, 2418-63, 2468-2514, 2516-17, 2623-36, 2641-2656.

----------- FOOTNOTE ENDS -----------
DRs Regarding Single Points of Failure

On or about April 25, 1991, the Air Force sent Grumman the following DR:

How does the proposed hierarchical multi-tier tree network minimize single points of failure when the failure of a single smart hub on tier #1 will cause the loss and isolation of 200 to 600 (or 1200?) users from the subsystems, corporate storage, and the remainder of the network?

Protest File, Exhibit 450. On or about April 26, 1991, the Air Force sent Contel the following DR:

Is the Government to understand that the offeror will not connect more than 192 users to a router? If this is not correct, then please provide the maximum number of users (worst case) that will be served by a single router and (1) still meet the performance requirements in attachment 4 and (2) minimize a single point of failure that would cause the loss or isolation of a large group(s) of users.

Protest File, Exhibit 932. At the time of the April 26 DR to Contel, the Grumman system design did not have the "bridging routers" or "brouters" that ultimately caused the Air Force concern. Later, the design was changed to add brouters. Transcript at 193, 270, 706-08, 747. Under Grumman's revised design, at least one community of interest with 320 potential users was connected to the network through a brouter. Grumman did not receive a second DR expressing concern about the number of users connected to a brouter. Transcript at 88-89, 709-13.

DRs Regarding Gateway Between JIMS and JSAN

Contel and Grumman each proposed a single gateway connection. The Air Force sent Contel a DR citing RFP Section C.31 [the section on operational concept which mandated minimizing single points of failure] stating:

Are you proposing only one hub and 802.3 connection to JSAN? If so, it appears the failure of this link will cause the loss and isolation of all the IMS users from JSAN (and vice versa). How does this design minimize single points of failure and disruption as specified in the RFP?

Protest File, Exhibit 930; Transcript at 95-96. Contel added a second gateway connection between JIMS and JSAN through the Wang VS 10000. Protest File, Exhibit 1784; Transcript at 96. This second connection proposed by Contel does not go through the Wang VS 300. Id. at 878.

The Air Force sent Grumman a DR referencing RFP Section C10.1.6 [JIMS Network Connection] stating:

It is not clear how one Smarthub with one 802.3 10 Base T connection to the 7040-3 communication processor will be sufficient to handle all traffic between JIMS and JSAN. Also, the failure of this link (or hub) will apparently cause the loss and isolation of all the JIMS users from the JSAN users.

Protest File, Exhibit 534. In response, Grumman stated that its approach was low risk and that, in light of the temporary, short-term nature of the JIMS environment, the reliability of its technology, and the simplicity of replacement in case of failure, it felt that the cost/benefit tradeoffs favored its approach. Protest File, Exhibit 1386. Grumman also said: "The JIMS gateway host and its subsystems also represent a single point of failure that cannot be obviated by multiple hubs or links." Id.
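The single-point-of-failure DRs quoted above reduce to a reachability question: if one network component fails, how many users lose their path to the rest of the system? Purely as an illustration -- this is not the evaluators' method, and the topology, component names, and user counts below are hypothetical, chosen only to echo the 320-user brouter concern noted above -- such a worst case can be computed as follows:

    # Illustrative sketch only: quantifying a "single point of failure"
    # as a reachability question. Topology and user counts are hypothetical.
    from collections import defaultdict

    def users_isolated(links, user_counts, core, failed):
        """Count users cut off from `core` when `failed` is removed."""
        graph = defaultdict(set)
        for a, b in links:
            if failed in (a, b):
                continue  # the failed component and all of its links drop out
            graph[a].add(b)
            graph[b].add(a)
        reachable, frontier = {core}, [core]
        while frontier:  # simple graph search outward from the network core
            node = frontier.pop()
            for nxt in graph[node]:
                if nxt not in reachable:
                    reachable.add(nxt)
                    frontier.append(nxt)
        return sum(n for comp, n in user_counts.items()
                   if comp != failed and comp not in reachable)

    # Two hypothetical communities-of-interest behind one brouter, echoing
    # the concern that a single brouter failure could isolate 320 users.
    links = [("corporate", "brouter1"),
             ("brouter1", "coi_a"), ("brouter1", "coi_b")]
    users = {"coi_a": 160, "coi_b": 160}
    print(users_isolated(links, users, "corporate", "brouter1"))  # prints 320

On this view, capping the number of users served by a single router and adding a second JIMS/JSAN gateway, as Contel did, are both ways of bounding the worst-case result of such a computation.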
Grumman never considered adding two redundant connections through the VS 300 because the VS 300 was less reliable than the smarthub Grumman had proposed and "connecting it to a unit that still had a very high failure rate . . . would not have obviated the problem." Transcript at 730-31. Also, Grumman believed that the VS 300 was part of the JIMS system and that the RFP did not permit changes to the internal JIMS system. Id. at 732.

DRs Regarding Security

Both offerors were issued DRs asking about their solutions to the twenty-four month delayed deliverable, CLIN 1090, for a B2 level Trusted Computing Base (TCB). Protest File, Exhibits 302, 973, 2029, 2311, 2901. Contel received a DR which stated:

Offeror needs to certify that the trusted MACH kernel and the B2 TCB developed by Trusted Information Systems (TIS) can be supplied by the offeror when ordered by the government approximately 24 months after contract award. The certification must come from TIS or the offeror and signed [sic] by someone of authority (not marketing representatives) on company letterhead and dated. The word "certify" must be used. The certification letter must be addressed to the prime contractor or to AFCAC.

Protest File, Exhibit 2311. In response, Contel stated:

. . . Trusted Information Systems (TIS) did not participate in the preparation of Contel's proposal and . . . there is no present agreement between TIS and Contel for TIS' participation in the work effort. If Contel receives the award, it may seek to negotiate a contract with TIS in accordance with the previous discussions. Alternatively, Contel as the systems integrator has the capability of performing the B2 TCB work inhouse or of securing the services of another knowledgeable vendor, such as Secure Ware, to do the work.

Protest File, Exhibit 2131. Grumman received a DR which stated:

Offeror's response to DR QY158 provided an AT&T certification letter saying the AT&T SV/MLS ES would be submitted for evaluation if the government ordered CLIN 1090. The current evaluation is based on a different platform. The NCSC will not allow B2 product to participate as part of the RAMP program. This in essence means the offeror must have his proposed TCB and platform go through the complete evaluation program. The RFP requires an already evaluated B2 TCB or one which is in phase 3 (Formal Evaluation) of the NCSC Evaluation Program [to] be delivered within 60 days after issuance of a delivery order. The delivery order will not be issued any earlier than 24 months after contract award. If AT&T does not intend to submit the AT&T SV/MLS ES for evaluation until ordered, how will the B2 TCB be in phase 3 (Formal Evaluation) of the NSCS [sic] Evaluation Program within 60 days[?]

Protest File, Exhibit 2901. Grumman responded:

Attached Certification (DOC 677) from AT&T provides an update and clarification to the previous AT&T B2 Certification. It describes a vendor program which will follow the upcoming NSA initiatives for B2 RAMP and "Porting" to ensure certification at the B2 level 24 months after contract award. . . .

Protest File, Exhibit 2594.

Evaluation Results

The pertinent color and risk assessments used in the evaluation ratings were defined to mean:

Blue: Exceptional. Exceeds specified performance or capability in a beneficial way to the Air Force; and has high probability of satisfying the requirement; and has no significant weakness.
Green: Acceptable. Meets evaluation standards; and has good probability of satisfying the requirement; and any weaknesses can be readily corrected.

Yellow: Marginal. Fails to meet evaluation standards; and has low probability of satisfying the requirement; and has significant deficiencies but correctable.

***

High Risk--likely to cause significant serious disruption of schedule, increase in cost, or degradation of performance even with special contract[or] emphasis and close Government monitoring.

Moderate Risk--can potentially cause some disruption of schedule, increase in cost, or degradation of performance. However, special contractor emphasis and close Government monitoring will probably be able to overcome difficulties.

Proposal Evaluation Guide (PEG), Protest File, Exhibit 114 at 12, 13.[foot #] 7

----------- FOOTNOTE BEGINS ---------

[foot #] 7 The SSEB chairman testified that the increase in cost in a high risk proposal meant Government cost. Transcript at 232.

----------- FOOTNOTE ENDS -----------

Contel was rated Green/Moderate Risk in the overall Technical Area whereas Grumman was rated Yellow/High Risk. Protest File, Exhibit 2823; Transcript at 230-31. The technical items were individually rated as follows:

                          Grumman    Contel
    User Friendliness     Yellow     Green
    System Architecture   Yellow     Green
    Transition            Yellow     Yellow
    Security              Green      Yellow
    Maintenance           Blue       Green
    Management            Green      Yellow

The risks for Grumman and Contel were as follows:

    Item                  Grumman    Contel
    User Friendliness     High       Moderate
    System Architecture   Moderate   Low
    Transition            Moderate   Low
    Security              Moderate   High
    Maintenance           Low        Moderate
    Management            Moderate   Moderate

Protest File, Exhibit 2825 at 30-37. The soundness ratings for Grumman and Contel were as follows:

    Item                  Grumman       Contel
    User Friendliness     Marginal      Acceptable
    System Architecture   Acceptable    Acceptable
    Transition            Acceptable    Acceptable
    Security              Acceptable    Marginal
    Maintenance           Exceptional   Marginal
    Management            Acceptable    Marginal

Id.

Evaluation of Grumman's Proposal as to User Friendliness

The SSEB's item summary for user friendliness explained the rationale for Grumman's Yellow/High Risk rating as follows:

Rating Assessment: This vendor was rated Yellow for the first factor (User Interface/Integration) and Blue for the second factor (Application Software Functionality). In the first factor the vendor exceeded 1 of the 5 standards and failed the other 4. Based on an overall assessment of the 5 standards and the relative degree of compliance (how close were those standards that failed and how well the one standard was exceeded) the vendor was judged to be Yellow for the factor. The fact that the vendor did very poorly and was not close in meeting any of the failed standards, the Vendor, although rated yellow was at a minimal level within that rating. A significant problem is that the character-based applications do not fully support the capabilities of the Graphical User Interface (GUI). The lack of GUI applications severely detracts from the ability to implement integrated suite of software applications. The second factor consisted of only 1 standard. The vendor exceeded this standard and was rated Blue for the factor. Even with equal weight given to each of the two factors, the minimal yellow rating of the second factor can not allow a higher rating than yellow for the item. The preponderance of potential risk for the overall implementation for this item lies with the first factor (User Interface/Integration). Therefore due to the increased risk, the item is given an overall risk rating of high.
Protest File, Exhibit 2823 at 87. In the factor summary for user interface/integration, which was not part of the SSEB report, the evaluators gave the following rating assessments and identified the following weaknesses in Grumman's proposal:

RATING ASSESSMENT: The main office automation software proposed . . . is a character-based application which has been modified to run in Motif. As a result most of the commands use a menu structure designed for character-based terminals or use arcane escape and control key sequences. A total of 6+ of 17 applications were assessed as user friendly or intuitive. . . . In addition, integration across applications is not evident. Each package (often supporting a group of applications) had its own set of menus and function keys. Help is inconsistent between packages and often not context sensitive, providing only an index of topics rather than help on the current function. The proposed software does not provide user friendly software for the primary office automation applications . . . .

. . . .

WEAKNESSES:

The primary office automation package . . . does not take advantage of the GUI capabilities of the x-windows interface. . . .

The graphics software is not part of the . . . software, and integration between the two is very weak.

The user interface software . . . is very weak and provides little to enhance the system usability.

Each software package has its own menu structure and function keys, methods for obtaining help, saving data, etc.

Copy and paste, particularly for graphics objects, was weak, even though the data could be moved between applications by other methods (import/export).

System administration must be performed from a single workstation; i.e., it cannot be performed from any workstation on the network, only the specified system admin workstation.

The undo functions in the graphics software was not consistent.

Protest File, Exhibit 2795. This factor summary was the result of an integrated assessment of Grumman's evaluation against the five standards. Transcript at 233-34. Although offerors were rated as either exceeding, passing, or failing a given standard, there were degrees of each. Id. at 234.

The Evaluation of Grumman's Proposal Against the Five Standards for User Interface/Integration

Grumman failed the first standard under user interface/integration, which was met when eight or more software applications were demonstrated to be intuitive. Protest File, Exhibit 2754. Six and one-half of Grumman's applications were rated as intuitive, making the rating a stated "20% below the minimum to meet the standard." Id. The most used programs for the Joint Chiefs are word processing, electronic mail, and composition graphics. Transcript at 393.

Grumman also failed the second standard for user interface/integration, which was met "when 50-70% of applications demonstrated use the same function keys for similar operations; [n]o operations demonstrated require more than 4 keystrokes." Protest File, Exhibit 2754. Two evaluators submitted the same comment:

Only 4 of 17 applications . . . use the same function keys for similar operations . . . . Whether the questionable . . . product plus twelve other products can provide a user friendly environment is highly risky.

Id. Grumman failed Standard 4, i.e., whether "help messages are easily invoked [and] help and error messages are easily understood and context sensitive." Id. The evaluators concluded that not all software had context-sensitive help. Id.
Standard 5 was whether "movement and integration of text and graphics can be accomplished." Id. One evaluator found Grumman met this standard, stating:

The vendor demonstrated the capability to move text and graphics between applications. However, the method is not consistent . . . . The vendor describes how x-Clipboard does not support vector and raster graphics, and has not built a user interface that shields the user from the implementation of the movement of text and graphics . . . . Weakness: does not provide an integrated, consistent approach to moving text and graphics.

Id. The other evaluator determined Grumman failed this standard, stating:

Offeror proposes using x clipboard to accomplish this, which will work between text-oriented applications, but is not error-free for conversion between graphics software application (either among graphics-oriented packages or between graphics and text-oriented applications).

Id.

The Evaluation of Contel's Proposal as to User Friendliness

The user friendliness item summary for the Contel proposal stated, in part, the following:

Rating Assessment: This vendor was rated Green for the first factor [Interface/Integration]. . . . In the first factor the vendor exceeded 1 of the 5 standards, met 1 standard and failed to meet 3 standards. Based on an overall assessment of the 5 standards and the relative degree of compliance (how close were those standards that failed and how well the standard was met or exceeded) the vendor was judged to be Green for the factor. The second factor consisted of only 1 standard. The vendor exceeded this standard and was rated Blue for the factor. However due to the marginal nature of the Green rating for the first factor, the overall rating for the Item is Green. The preponderance of potential risk for the overall implementation for this item lies with the first factor (User Interface/Integration). Therefore this item is rated overall moderate in risk.

Protest File, Exhibit 2823 at 107. The factor summary for Contel in user interface/integration provided in part:

User Interface/Integration

RATING ASSESSMENT: The main office automation software proposed, Asterix, is a fully-WYSIWYG [what you see is what you get], GUI package designed with user friendliness as a paramount consideration. A total of 10 of 17 proposed applications were assessed as intuitive or user friendly. However, integration across applications was not evident. Each package (often supporting a group of applications) had its own set of menus and function keys. Help was inconsistent between packages. Copy and paste of information between windows of different applications was demonstrated. One of the copy and paste demonstrations in the LTD did not work properly . . . and the proposal indicates the capability will not be available until the graphics conversion software is available one year after contract award. The proposal satisfies the requirements for user friendliness for most primary applications, but lacks integration between applications.

. . . .

WEAKNESSES:

Electronic mail software was cumbersome to use and not well integrated with the other office automation software. Mail screens were cluttered and held too closely to the "standard" Unix mail analog, a notoriously unfriendly electronic mail system.

The mapping software demonstrated at the LTD did not run under the proposed window manager. . . .
All system administration tasks demonstrated required command line use of Unix commands. This would require substantial training of system administrators and does not provide the reduced level of system administration workload expected.

The integration of user interface between applications was not evident. Each software package had its own menu structure and function keys.

Many of the commonly used commands required more than four keystrokes to complete.

The undo functions in the graphic artist software demonstrated at the LTD were inconsistent.

Moving text and graphics between applications is inconsistent and, in some cases, could not be accomplished in the LTD.

Help screens were inconsistent between applications and, in some cases, were not context sensitive. . . . This is particularly a weakness for the electronic mail software.

Protest File, Exhibit 2796.

The Evaluation of Contel's Proposal Against the Five Standards for User Interface/Integration

Contel met the first standard regarding intuitive software, having offered ten applications which were demonstrated to be intuitive, and received a "high checkmark" or high passing evaluation from one evaluator for this standard. Protest File, Exhibit 2746. Electronic mail, however, was noted to be cumbersome and a "substantial weakness" under this same standard. Id. Contel, like Grumman, failed the second standard because only three of seventeen applications used the same function keys for similar functions. Id. Both Contel and Grumman exceeded the third standard, which was met "when 80-90% of all commands demonstrated for UNDO can undo the previous command." Id.; Protest File, Exhibit 2754. Like Grumman, Contel failed standards 4 and 5. Protest File, Exhibit 2746.

Evaluation of Grumman's Proposal in System Architecture

In the second technical area, system architecture, Grumman was rated Yellow/Moderate Risk. Protest File, Exhibit 2823; Transcript at 241. The SSEB Report provided in pertinent part:

RATING ASSESSMENT: . . . . The offeror has proposed a network comprised of an FDDI [Fiber Distributed Data Interface] backbone with Ethernet sublans. However, this backbone does not take full advantage of the dual ring FDDI technology due to reliance on Ethernet connections to corporate processors and use of two separate brouters to connect the communities-of-interest to each other and to corporate storage. Single points of failure could affect up to 320 users. The proposed workstation solution did not meet the standard for standalone operation or administrative workload, but did exceed the standard for MS-DOS performance. The offeror's stated administrative workload for the corporate storage is one hour, which if correct would exceed the standard. However, the offeror failed to address functions which are associated with corporate storage (i.e., system administration), resulting in a "met the standard" rating.

WEAKNESSES:

- The network design proposed has single points of failure which could affect large (300+) user groups . . . .

- The network design proposed provides for only a single Ethernet connection between JSAN and JIMS. . . .

. . . .

- As proposed, only 14 of 27 applications will run on the workstation in standalone mode . . . .

- An administrative workload of 20 hours per week for 20 workstations is required . . . .
RISK ASSESSMENT: Overall risk for this item is moderate. The offeror did not fully address the system administration requirements. His estimate of 20 hours per week to administer a 20 user community-of-interest translates to 1600 hours per week for a 1600 user network, or a 40-man workforce simply to administer JSAN workstations. . . . The single connection between JIMS and JSAN is a critical point which could isolate JIMS users from JSAN users in the event of failure.

Protest File, Exhibit 2823 at 88-89; Transcript at 241. The workstation factor summary for Grumman, which was not included in the SSEB report, revealed that Grumman had failed to meet two of the three standards and exceeded the third. Protest File, Exhibit 2780.

Evaluation of Grumman's Proposal Against the Pertinent Standards for System Architecture

Grumman failed the third standard under the communications network factor, which was met "when no single point of failure affects no more than 30 users." Protest File, Exhibit 2752. A single point of failure in Grumman's system could affect up to 320 users. Id.

Standard 1 for workstations was stated to be met when fifteen to twenty applications run in the standalone mode. Protest File, Exhibit 2752.[foot #] 8 Grumman failed this standard because fourteen applications, some only partially, ran in standalone mode. Id.

----------- FOOTNOTE BEGINS ---------

[foot #] 8 The Air Force did not perform any analysis in determining whether fifteen or any other number would provide substantial benefit to the Government. Transcript at 111-12. The evaluators had to "pick a certain point at which [they] were going to give credit between 0 and 27." Id. at 112.

----------- FOOTNOTE ENDS -----------

Standard 2 of the workstation factor evaluated the amount of time required to administer centrally a twenty-user community of interest. Protest File, Exhibits 2752, 2753. The standard was met if the time required was one to two hours per week. Grumman's proposal stated the time as twenty "hours," but did not specifically state whether that was manhours or system hours. Protest File, Exhibit 2849A at I-J-116. Specifically, Grumman's Technical Proposal included the following provision:

    |              | Network Time  |               | Nightly       | Tape Backup  | Tape Backup  | Weekly Time |
    |              | for Daily     | Network Time  | Computer Time | Time for     | Time for     | to Create   |
    |              | Incremental   | to Copy Daily | to Analyze    | Daily W/S    | Nightly      | W/S Disk    |
    | No. of       | Backups       | W/S Security  | Security      | Security     | Security     | Images on   |
    | Workstations | from W/S      | Audit Trails  | Audit Trails  | Audit Trails | Audit Trails | Server      |
    | 20           | 39 Sec        | 0.1 HR        | 20 minutes    | 6 minutes    | 5 minutes    | 1 Hour      |
    | 178          | 340 Sec       | 0.7 HR        | 3 hours       | 8 minutes    | 30 minutes   | 8 Hours     |

Figure J2.5-2 Server Time Summary for Workstation Backup and Security Administration

Administration Impact and Time Requirements

Administering a 20 workstation COI will require approximately 20 hours a week plus time required to review any security related reports. A portion of this time will be dedicated to development activities which is in support of the full 1600 workstation network. User impact would result primarily from server upgrades. This equates to an estimated one hour nightly and one or two weekends per year for operating system enhancements. Network impact will result primarily from backups, however, careful scheduling of these backup activities can greatly minimize any affects [sic] on throughput.

Protest File, Exhibit 2849A at I-J-116 (emphasis added).
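The twenty-hour figure and the SSEB's extrapolation quoted above reduce to straightforward arithmetic; the 40-hour workweek divisor is not stated in the record but is implicit in the SSEB's "40-man workforce" figure:

    \[
    \frac{20\ \text{hours/week}}{20\ \text{users}} \times 1600\ \text{users}
      = 1600\ \text{hours/week},
    \qquad
    \frac{1600\ \text{hours/week}}{40\ \text{hours/person/week}}
      = 40\ \text{persons}.
    \]

On Grumman's intended reading, by contrast, the twenty hours were machine time; its witness's hearing testimony would put the human workload at 2.3 + 4 + 3.5 = 9.8 manhours per week, as discussed below.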
Although Grumman intended the twenty hours referenced in the provision quoted above to mean network time or system processing time and not manhours, the Air Force interpreted the proposal to mean manhours. Transcript at 262. However, Grumman never said anywhere in its proposal the total number of manhours it would take to perform system administration functions. Transcript at 719. The Air Force did not send Grumman any DRs or questions relating to this number of manhours. Id.

During the hearing, Grumman's engineering manager for JSAN identified three functions of system administration requiring human intervention and assigned a total number of manhours per week to perform each such function for a twenty-user COI, as follows: optimizing file space, 2.3 hours; software updates, 4 hours; security administration (unrelated to reviewing security related reports), 3.5 hours. Transcript at 719-26. This would indicate that Grumman's manhours per week for system administration should have been 9.8 hours.[foot #] 9

----------- FOOTNOTE BEGINS ---------

[foot #] 9 This witness' testimony on this point, however, was inconsistent and confused. Transcript at 719-26, 752-75. Apparently, at his deposition he had testified that the manhours per week should have been 6.1; at the hearing he said the 6.1 meant machine hours, then said that this too was an error. Id. at 753-60. What the witness did seem certain of was that the 20 hours "should have been machine hours for a 178-user community." Id. at 756.

----------- FOOTNOTE ENDS -----------

The Evaluation of Contel's Proposal as to System Architecture

The SSEB report item summary for Contel's system architecture included the following:

RATING ASSESSMENT: The offeror met the standard for the communications network factor, met the standard for the workstation factor, and exceeded the standard for the corporate storage approach factor. The offeror proposes an FDDI backbone with Ethernet segments to support communities-of-interest, corporate processors, and corporate storage. The network is designed with a good amount of redundancy to minimize failures. . . . The offeror's approach to managing corporate storage minimizes the need for administrator intervention.

STRENGTHS:

. . . .

- Provides the capability to run 24[[foot #] 10] of the 27 desired applications in standalone mode. . . .

- System administrator functions should take no more than 8 hours per week to administer corporate storage. . . .

WEAKNESSES:

- The proposed network design includes single points of failure which could affect up to 192 JSAN users (i.e., router failure). . . .

- The offeror stated that workstation administration would take 5-8 hours per week to administer a 20 workstation group. . . .

Protest File, Exhibit 2823 at 108.

----------- FOOTNOTE BEGINS ---------

[foot #] 10 As the Air Force later realized, Contel only offered nine applications in the standalone mode, not twenty-four; the other fifteen applications were options. Transcript at 116-18. We credit the testimony of the SSEB chairman that the SSAC and SSA were told Contel only offered nine. Id.

----------- FOOTNOTE ENDS -----------

Like Grumman, Contel exceeded the first standard and failed the second standard under the communications factor. Protest File, Exhibit 2753. Contel also failed the third standard, but less egregiously than Grumman since a single point of failure could affect up to 192 users. Id. Contel's factor summary for the workstation factor indicated that Contel met this factor even though Contel had failed two out of three standards and exceeded the third. Protest File, Exhibit 2781. The factor summary stated:
Offeror meets the standard in the workstation factor. Offeror chose to only provide 9 applications of 27 which does not meet the standard. However, offeror [h]as provided an option to add 15 additional applications if desired. The MS-DOS mode exceeded the standard performance factor. In the system administration area offeror does not meet the standard. Offeror states an average of 5-8 hours weekly to administer a COI of 20 workstations. Overall, the risk associated with this factor is moderate.

Protest File, Exhibit 2781. Offerors' performance at the LTD was also considered in the evaluation. E.g., Protest File, Exhibit 2823. In discussing Grumman's performance of the word processing demonstration at the LTD, the Air Force noted:

There are many instances (see output results) that would aptly demonstrate the user UNfriendliness of this software. Little items like page numbers, line numbers, column position indicators and the like that didn't work properly made it more difficult to work through the document. As demonstrated, there are indications that requirements in the RFP aren't being met (see results also). It is not apparent if this is actually the case or if the demonstrators themselves are not "expert" enough to demonstrate the full capabilities of the software.

Id. at 75.

The Offerors' Compliance with the B2 Security Requirement

Both Grumman and Contel satisfied the minimum mandatory B2 security requirement. According to Grumman's JSAN program director, there is a very limited market of vendors available to deliver trusted systems of any kind, especially B2 systems. Transcript at 591.[foot #] 11 The Air Force had concerns about all vendors' ability to meet the B2 requirements. Transcript at 309. Neither Grumman nor Contel had an extant B2 product which satisfied JSAN requirements when BAFOs were submitted. Protest File, Exhibit 2823; Transcript at 311-14, 328-42, 907-22. Contel, although it was using the TMach as a starting point for its developmental efforts, had not yet decided what specific products it was going to offer for this delayed deliverable. Transcript at 926-27.

----------- FOOTNOTE BEGINS ---------

[foot #] 11 According to Grumman's chief security engineer, only four systems in total have ever reached the B2 formal evaluation phase at NSA. Transcript at 808. These took from four to seven years to get to the formal evaluation phase. Id. at 809.

----------- FOOTNOTE ENDS -----------

Grumman was evaluated as having sufficiently met the security standards, receiving a technical item rating of green/moderate risk, while Contel, rated yellow/high risk, failed to receive any evaluation credit for this item. Protest File, Exhibit 2823 at 91.

Contel's lead security engineer for JSAN testified that Contel itself has "several labs" for security products in which it evaluates trusted products and tests them. Transcript at 904, 907. This engineer had himself previously participated as a team member in evaluating a B3 NATO command and control system. Id. at 905. On its proposal team Contel had its manager of security engineering, who had previously been manager of TCB development for a company while that company was developing a B3 product which is presently in formal evaluation at the B3 level. Id. at 907. Contel also had two IBM security engineers as "regulars" on its security team. Id. at 908.
Contel's lead security engineer explained why, in his view, Contel could meet the B2 requirement:

[I]t was a consensus of the team and the experience of people that have worked on B3 systems, people who have gone through the evaluation process before. Also, it was based on the knowledge of what we currently had as a basis, an engineering basis for the product. We had a significant amount of what we proposed already in the process of being developed and we identified, at a high design level, what had to be performed, what had to be modified in the OSF-1 product to get us to the B2 solution.

Q What are the characteristics of your solution, if any, that led your team to conclude that this could be accomplished?

A . . . [W]e had a basis in hand that was to be built upon. We had a port to the right operating system. We had a port to the right platform. That port included B1 functionality. The research had already been done on that operating system as to how to bring it up to B2. There was a prototype that we could look at, if we wanted to, as a basis. We were ending up with the right programming interface, the right user interface. I don't want to say that the job was going to be easy, but it was certainly -- there was no reason to believe it couldn't be done in two years.

Transcript at 922-23. Contel's certification, which was signed by its Acting President, provided:

I hereby certify, to the best of my knowledge and belief, that, if awarded the contract, Contel will be able to supply the B2 TCB within 24 months of contract award, assuming that the NCSC accepts the B2 TCB for evaluation and conducts the evaluation in a timely manner consistent with the schedule as stated in the Government's response to vendor question 59.

Protest File, Exhibit 2849. The SSEB chairman determined that Contel's certification was conditional, because Contel could not certify that NSA would finish the evaluation on time. Transcript at 217-20. The SSEB chairman testified as follows:

We looked at this certification, and although it has a condition, what that condition is, what it says to me is if you do what you said in the RFP, or if NSA does what you said they were going to do for us, then we're going to meet your requirement.

Transcript at 219.[foot #] 12

----------- FOOTNOTE BEGINS ---------

[foot #] 12 Later in his testimony he said, "So what I read the condition as saying is, just acknowledging the same thing we tried to acknowledge in the requirement." Transcript at 220.

----------- FOOTNOTE ENDS -----------

The SSEB report included the following item summary for Contel's security:

RATING ASSESSMENT: This offeror fails to meet security standard 2 of 3. No B2 TCB components are in or beyond the "Vendor Assistance Phase." For this reason, plus the uncertainty surrounding the development of the Trusted Mach OS for the B2 TCB (i.e., no supporting technical documentation), this proposal is rated yellow.

STRENGTHS:

- Single homogeneous system which reduces user training, system administration, maintenance, and the integration of different system components.

- Transition from Security Phase 2 to 3 involves only replacement of the Secure AIX with the TMach.

- Phase 3 is a single network integrating the SCI and Collateral environments.

WEAKNESSES:

- Lack of validatable evidence that the B2 TCB will meet JSAN requirements.

- Reliance on certification, instead of supporting technical literature, to demonstrate ability to produce a product.

- The security components proposed are still under development (no submission to NCSC as of yet) leaving much to faith.
Protest File, Exhibit 2823 at 110. The Air Force was also skeptical about Grumman's ability to produce an integrated and easily manageable B2 secure system. Transcript at 249. The SSEB report reflected this concern:

RATING ASSESSMENT: Although the security solution appears sound, the evaluators are skeptical about the offeror's ability to produce an integrated and easily manageable B2 TCB. The offeror provided "last minute" solutions for integrating the logon and audit capabilities only after the insistence of the evaluators. They also continue to propose the use of the XTS-200 even after B2 functionality has been achieved, seemingly indicating an inherent lack of faith in the security reliability of the proposed B2 TCB.

STRENGTHS:

- The architecture, when implemented, could meet JSAN Phase 3 security requirements, to include NSA evaluated TCB components, way ahead of schedule. The products are available for evaluation, and represent proven technology. Key architectural components are homogeneous even though the overall architecture is heterogeneous. The proposed component mix minimizes time and cost factors while transitioning to Security Phase 3.

WEAKNESSES:

- Integration of B2 TCB components to a JSAN B2 TCB. Although the offeror agreed belatedly to integrating component logon and audit capabilities into unified system capabilities, the overall system integration and user friendliness aspects of the solution were only superficially addressed.

- Continued use of XTS-200 in Security Phase 3. It appears to add unnecessary complication to system design with questionable benefit to JSAN security.

Protest File, Exhibit 2823 at 91. The Air Force's security item chief testified regarding Grumman:

A There was one requirement they didn't meet -- they couldn't.

Q And in that particular requirement what did Grumman do?

A They provided certification to meet that requirement.

Q Do you know what that requirement was and who the certification was provided by?

A The -- under the Grumman proposal, one of their -- one of their pieces of the B2 TCB was the XTS 200, working under the stop operating system. The XTS 200 -- well, the RFP required it to be 128 compartments for information for the system. The XTS 200, under the stop operating system, could only provide 64. It's a physical limitation. And it was currently under evaluation at NSA, currently in the formal evaluation stage with this configuration. At the face-to-face meeting in July, we told them that they were in jeopardy of being thrown out of competition, based on the fact they couldn't meet this one specification. And they replied back with a certification letter from the vendor, Honeywell Federal Systems, that they would [meet] it when it came time to produce the B2 TCB.

Transcript at 318-19. The supplier of the high speed processor for Grumman's solution provided the following certification:

This letter is to certify that Alliant Computer Systems will supply Grumman Data Systems with a version of Concentrix that is a native UNIX System V, Release 4. Alliant's scheduled upgrade to System V Release 4, is based on AT&T's schedule. Alliant's FX/2800 system requires System V Release 4 ESMP (Multiprocessor, B-2 level secure) UNIX from AT&T. AT&T's planned released date of System V Release 4 ESMP is mid-1993. Alliant will commit to a "not to exceed development period," after the availability of this release from AT&T.

Protest File, Exhibit 2912.
Grumman also submitted the following certification from AT&T:

    The purpose of this letter is to certify that AT&T will comply with the RFP requirement to have System V Release 4ES complete evaluation on the AT&T hardware platforms for JSAN or be in Phase 3 (Formal Evaluation) within 60 days of a delivery order under CLIN 1090.

    . . . .

    It would be AT&T's intention to use either RAMP or Porting to extend the Release 4ES B2 rating to the JSAN hardware platforms. However, if neither of these programs is viable for meeting the RFP requirement, AT&T will initiate the evaluation process sufficiently in advance of the exercise date for CLIN 1090 that the 60 day requirement will be met.

Protest File, Exhibit 2912.

The Cost Evaluation

The SSEB report contained a cost evaluation report with a ranking of BAFOs as well as separate cost analyses for each offeror which included a breakdown for, inter alia, hardware, maintenance items and software. Protest File, Exhibit 2823. The ranking was:

    Proposal      PV     PV            % Above      CD     Current
    Code          Rank   Dollars       Ranking 1    Rank   Dollars

    [Grumman]     1      $57,677,052   -----        1      $ 69,476,354
    [Contel]      2      $91,584,742   58.8%        2      $114,381,666
    [        ]    3      $93,829,206   62.7%        3      $117,890,868

Id. at 113.

The Briefing to the SSAC

On November 26, 1991, the SSAC received a briefing on this procurement. Protest File, Exhibit 2825; Transcript at 169. The members were told that all three offerors met the minimum mandatory requirements and were eligible for award. Protest File, Exhibit 2825 at 13. The technical chief's area ratings were included in the SSEB report which was disseminated to the SSAC. Protest File, Exhibit 2823. The SSEB report contained detailed evaluation data down to the level of the six item summaries for each of the three offerors in the competitive range. Protest File, Exhibit 2823. The report also contained cost evaluation data for the three offerors, concluding that with the possible exception of one CLIN each for Grumman (2600 -- developed software maintenance) and Contel (1090 -- security) all prices offered were reasonable and realistic.[foot #] 13 Id.

----------- FOOTNOTE BEGINS ---------
[foot #] 13 General comments on Grumman's cost included:

    CLIN 1480, Documentation Conversion Utility, price appears to reflect the server license change made at BAFO and was evaluated as correct. However, the (continued...)
----------- FOOTNOTE ENDS -----------

Other than reciting the actual evaluations of technical and cost for each offeror, the SSEB report did not discuss price/technical tradeoff. Protest File, Exhibit 2823. The SSAC unanimously agreed with everything in the SSEB report. Transcript at 252. As the minutes of the SSAC briefing reflect, there were formal technical proceedings followed by formal cost proceedings. Protest File, Exhibit 2828. The entire minutes regarding cost stated:

    a. There was a general comment on what causes the disparity in prices? ANSWER: There are a number of possible factors, to include the hardware discount for Vendors Q and E (41% vs 21%), as well as an offeror's individual solution to the technical requirements and the offeror's basic pricing strategy to try to win the contract.

    b. There seemed to be discrepancies between the Cost and Technical areas (i.e. the cost did not seem consistent with the technical solution). These areas must be made clear when the selection decision is made. ANSWER: It will be made clear by the SSAC in report/briefing to the SSA.

    c. Are there any hidden costs concerning CLIN 1090? ANSWER: No.
    d. There was much discussion on the User Support Area. What are we getting for the money? How many people/teams? ANSWER: All vendors met minimum mandatory. The price doesn't necessarily reflect the level of effort, as pricing must be taken as a whole for a given proposal.

    e. Why is the ICE substantially different from final cost? ANSWER: The ICE is a conservative estimate. The average discount is usually 40-60% below GSA, an important element of the ICE. Further, offerors can negotiate multi-user licenses for software that are significantly cheaper than the individual licenses used to come up with the ICE.

----------- FOOTNOTE BEGINS ---------
[foot #] 13 (...continued) maximum number of users columns [sic] on Table B-10 was not changed. Assuming the maximum number of users columns [sic] is correct and the pre-BAFO quantities should be used, the overall evaluated cost would be about $3.5M higher for this offeror. Id. at 121. This was apparently not reflected in Grumman's evaluated cost. Id. at 113.
----------- FOOTNOTE ENDS -----------

    f. The Council stressed that the two areas, Technical and Cost, were prioritized by the SSAC when the acquisition process began and the decision was that Technical was more important than Cost. The SSAC revalidated the priority of Technical merit over Cost.

Id.

The minutes contained the following discussion of user friendliness:

    There was considerable discussion on the definition of "User Friendliness" and the difference between yellow and green in this area. Colonel Hawley [from the Joint Staff] emphasized the Joint Staff's overriding emphasis on User Friendliness as the critical keystone of JSAN. Because the typical Joint Staff officer is a fighter pilot, tank driver, or ship driver, would the command for operating the system be intuitive to him? ANSWER: [Contel] will provide what was stated in the evaluation standards and that a basic user could figure it out with little training. [Grumman] and [ ] proposed systems that meet only the minimum requirements and will take longer to learn.

Protest File, Exhibit 2828.

In the area of system architecture, the significant impact of a single point of failure in Grumman's system, i.e., that it would affect 320 users, was noted, while a single point of failure in Contel's system would affect 192 users. Id. There was also concern expressed over the administrative workload of twenty hours per week in Grumman's solution. Id.

The Briefing to the SSA

In briefing the SSA, the SSEB chairman provided primarily the same information as had been given to the SSAC, except that he added a discussion of what had transpired at the SSAC briefing. Transcript at 183. The SSEB chairman testified:

    For example, I think in the system architecture ar[e]a, I briefed not only the same information that I briefed the Advisory Council, but also that we had had discussions on what exactly it meant to have something available in the stand-alone mode, what it meant, again, specifically that the 9 versus 24, the 9 plus 15 in the optional are non-priced. I tried to make a specific point that we could only give credit for that which was priced; also that there had been discussion on the points of failure and what it meant to be isolated from network services.

Id.

The SSA was told that, looking at the most probable present value life cycle cost, Grumman would cost $57,677,052 and Contel would cost $91,584,742, 58.8 percent more than Grumman. Protest File, Exhibit 2831 at 38;[foot #] 14 Transcript at 486.
A most probable current dollar life cycle cost evaluation was also presented; Grumman's cost was $69,476,354 and Contel's $114,381,666, 64.6 percent more.[foot #] 15 Id. at 39.

The Selection

The SSEB report was given to the SSA, and he based his decision on that report, the minutes of the SSAC briefing, and information conveyed to him during a briefing on December 4, 1991, including slides reduced to hard copy. Transcript at 482-83, 498, 520. The SSA reviewed the line item costs for both the Grumman and Contel proposals in the SSEB report, but he did not specifically equate a cost difference with any technical differences. Transcript at 510. The SSA found it significant that both Grumman's and Contel's cost proposals fell below the ICE. Id. at 511-12.

At the end of the briefing it was apparent to the SSA "that there was a clear winner in the technical area." Transcript at 535. He explained:

    There was no doubt in my mind, but I didn't want to just call it at that time. I wanted to make certain that I was comfortable with that decision and that it met all the requirements that we had been charged to do, and that was to determine the best value for the Government; in other words, what are we getting for what we are spending and to make sure that we properly and carefully measured how well each of the offerors met the requirements of the proposal and we met -- and we did the "how well" part by using the evaluation criteria against the standards that were set.

Transcript at 535-36.

In describing the technical superiority of Contel, the SSA testified:

    Q Did you have any other understanding as to why . . . in the overall technical scoring in those two areas Contel received a green?

----------- FOOTNOTE BEGINS ---------
[foot #] 14 The third offeror's present value cost was $93,829,206, 62.7 percent above Grumman's. Id.

[foot #] 15 The third offeror's current dollar cost was $117,890,868, 69.7 percent above Grumman's. Id.
----------- FOOTNOTE ENDS -----------

    A I do. There was a significant difference in the way those scores were meted out. If I recall correctly, the -- several of the -- three of the, if I get the word right, considerations were scored as minus or yellow on the GDS [Grumman] proposal, and those three were barely in the marginal area. They were very low marginal. And that was a driving factor. Excuse me, four. I said three, I meant four. One met standards, four fell well below standards. And that was a driving factor in the overall marginal assessment assessed against GDS.

    Q You mean, and the marginal means the yellow?

    A The yellow, excuse me. Yes.

    Q Okay. And, in terms of . . . the green for Contel, you understood that they were evaluated as providing more extras that received [e]valuation credit?

    A Extras in the sense that they were determined to show significant contributions to the Government, in the area -- in the one consideration that exceeded standard, and the other consideration that met standards. The three that -- three other considerations that did not meet standards were marginally below the standard level, in other words, very close to meeting standards.

    Q Okay. And you're speaking now about the user-friendliness area?

    A I'm speaking of the user-friendliness and the system -- and the system -- excuse me, the use interface integration portion.

Transcript at 491-92.
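The cost comparisons recited in these findings are simple arithmetic on the evaluated life cycle costs. The following minimal Python sketch is offered purely as an illustrative check of that arithmetic -- the dollar figures are those reported in the record (Protest File, Exhibits 2823, 2831), while the variable and function names are the editor's own:

    # Illustrative check of the evaluated-cost comparisons in the findings.
    pv_costs = {  # most probable present value life cycle cost
        "Grumman": 57_677_052,
        "Contel": 91_584_742,
        "Third offeror": 93_829_206,
    }
    cd_costs = {  # most probable current dollar life cycle cost
        "Grumman": 69_476_354,
        "Contel": 114_381_666,
        "Third offeror": 117_890_868,
    }

    def premium_over_low(costs, low="Grumman"):
        """Percent by which each evaluated cost exceeds the lowest-cost offer."""
        base = costs[low]
        return {name: round(100.0 * (cost - base) / base, 1)
                for name, cost in costs.items()}

    print(premium_over_low(pv_costs))
    # {'Grumman': 0.0, 'Contel': 58.8, 'Third offeror': 62.7}
    print(premium_over_low(cd_costs))
    # {'Grumman': 0.0, 'Contel': 64.6, 'Third offeror': 69.7}
    print(pv_costs["Contel"] - pv_costs["Grumman"])
    # 33907690 -- the present value difference underlying the "some $35
    # million" premium discussed later in the opinion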
Both during the briefing and afterwards when he deliberated on the selection decision, the SSA "continued to recall the overall importance of user friendliness and system architecture -- because these two . . . plus the transition plan are the three things that most impact the day-to-day operations of the system . . . and having a usable product in the hands of the Joint Staff." Transcript at 534. The SSA did not describe any of the specific technical characteristics in Contel's proposal as to user friendliness, system architecture, or transition which he considered to be superior to Grumman's and to justify a higher cost. Transcript at 478-544.

Regarding a cost-technical tradeoff, the SSA testified:

    Q Sir, focusing on the amount by which Contel was perceived to be better than Grumman in those first two technical areas, namely user friendly and system architecture, is it fair to say that you never tried to associate a specific dollar amount corresponding to that perceived technical gap if you will; is that right?

    A That is correct.

    . . . .

    Q [T]here was no training cost difference that was ever developed, right?

    A That's correct.

    Q Sir, you were give[n] a technical briefing and you were given the cost briefing, right?

    A Correct.

    Q But you were never given a briefing which weighed the various costs and benefits that might be associated with these offerors against each other, right?

    A That is correct.

    . . . .

    JUDGE WILLIAMS: General, this is by way of clarification. I know that you testified in response to your questions by Mr. Winik that no briefing was given you regarding weighing of a cost-benefit, but did you yourself perform a cost-benefit analysis in your deliberations?

    THE WITNESS: No, I didn't.

    . . . .

    Q I just want to make sure we're clear, General. You indicated in response to the Judge's question that you performed no cost-benefit analysis. I just want you to, to make sure we're talking about the same thing, reconcile that with other testimony that you considered cost and that you were being [sic] a best value analysis. Could you just tell us exactly the extent to which cost was in there or --

    A Yes, I'm sorry.

    Q -- not in there . . . --

    . . . .

    THE WITNESS: So in terms of a cost-benefit analysis, no. What I was -- but did I discount the cost data? I did not discount it, and I certainly considered it and weighed that, and . . . I thought through that again to make sure that we were in fact getting the best value, because best value really is, in my mind, what are we getting, we, the Government, getting for the money spent, and in this case, we are getting a very -- a clearly superior technical product worth, in my estimation, the additional costs.

    Q But you did not quantify a dollar value on a factor-by-factor consideration?

    A No, I did not.

Transcript at 492, 498-99, 502, 543-45.

When asked: "What role if any did costs have in the decision making process?" the SSA answered:

    They were -- I considered them. I weighted against the independent -- the Government's independent cost estimate. I fully appreciated that it was noticeably, considerably above [Grumman's] offering, but there was such a clear distinction in the technical area with technical being over cost, again, that I was -- that it was determined by me to be the best value for the Government.

Transcript at 537.

The SSEB chairman had prepared three selection decision documents for the SSA's signature, each making the best case for award to a different offeror. Transcript at 186-87.
The SSA, having decided on Contel (without knowing its identity), signed the letter prepared for Contel, making no substantive changes, thus adopting the letter. Transcript at 187, 529. The Source Selection Decision Document states, in part:

    2. After examining the proposals and analyzing the evaluation results, I selected Contel based on the factors listed in the provisions of the solicitation entitled "Basis of Award." Consistent with those provisions, Contel's offer constitutes the best value for the government.

    3. The major strengths of the successful proposal are the user friendliness of the software system, the application software desired functionality proposed, the reduction in system administration workload, and the plan for limiting transition impact on the current users. Major risks are the use of a developmental B2 security solution, timely availability of the full software capabilities, and a lack of experience maintaining or integrating Wang equipment, lack of large network maintenance experience, and lack of experience providing security systems at the JSAN level. This proposal was the best choice even though it did not represent the lowest present value cost because the improved technical solution over the lowest cost proposal was significant in the most important evaluation items and represents a system far mor[e] capable of meeting the Joint Staff's automation requirements.

Protest File, Exhibit 2832.

Discussion

Did the Air Force Fail to Make a Meaningful Cost-Benefit Analysis?

Grumman claims that the Air Force never attempted to analyze the technical differences between the two proposals or to assess whether those differences were worth a 58.8 percent higher cost. Grumman further argues that the technical differences were minor and did not justify selection of the higher-priced offeror. The Air Force and Contel counter by urging the Board to refrain from second-guessing the SSA's decision, arguing that the decision was not irrational. Respondent's Posthearing Brief at 27-28; Intervenor's Posthearing Brief at 48-49.

We recognize that traditionally agencies have been accorded great discretion in performing technical/cost tradeoffs where the solicitation does not set forth the specific weights to be applied. E.g., TRW Inc., GSBCA 11309-P, 92-1 BCA 24,389 at 121,790, 1991 BPD 205 at 14 (citations omitted). This does not mean, however, that we may shrink from our responsibility to determine whether the agency made a cost/technical tradeoff analysis and whether it was adequate. Especially in an instance such as this -- where both protester and the awardee met the mandatory technical requirements, the enhancements in technical approach were not numerically scored or otherwise quantified, and the awardee's evaluated cost exceeds protester's by 58.8 percent, some $35 million -- we will scrutinize whether the preponderance of the evidence in the record shows that an adequate analysis was undertaken.

There is a basic problem in attempting to review the SSA's cost/technical tradeoff decision here. The record contains no explanation of why the SSA concluded that the technical enhancements in Contel's proposal warranted a 58.8 percent higher price. We have quoted at length from the documents provided to the SSA, his testimony, and the decision itself to provide the reader with the full extent of the cost/technical tradeoff analysis done.
We are left, however, with no meaningful description, quantification, or reasoned analysis of the increased "value" of Contel's superior user friendliness or system architecture over Grumman's.

Contel contends that our decision in Oakcreek Funding Corp., GSBCA 11244-P, 91-3 BCA 24,200, 1991 BPD 156, suggests that we will accept generalized explanations by source selection officials for cost/technical tradeoff rationales. Intervenor's Posthearing Brief at 55-56. We will not. In Oakcreek, we sustained award to a higher priced, technically superior proposal because we found that the selection official justified the price premium; technical/cost was weighted 80/20, and the selection official testified that the additional cost of the winning proposal "was justified by the reduced risk . . . associated with its extensive, long term, highly successful track record and the significant superiority of its technical proposal". Oakcreek, 91-3 BCA 24,200 at 121,041, 1991 BPD 156 at 9-10. We expressly recognized in Oakcreek that "[e]ven when the RFP states that award will be made to the offeror with the highest point score, such an award must be justified in light of any extra expenditure required." Id. at 121,041, 1991 BPD 156 at 9 (citations omitted). See also CRC Systems, Inc., GSBCA 9475-P, 88-3 BCA 20,936, 1988 BPD 136 ("These two decisions [DALFI, Inc., GSBCA 8755-P, 87-1 BCA 19,552, 1986 BPD 228, motion for reconsideration denied, 87-1 BCA 19,584, 1987 BPD 15; and DALFI, Inc., GSBCA 8975-P, 87-3 BCA 20,018, 1987 BPD 131, motion for reconsideration denied, 87-3 BCA 20,070, 1987 BPD 141] . . . clearly evince the Board's insistence that the responsible selection officials apply a reasoned analysis in weighing the costs to be saved against any technical advantages to be obtained.").

The Comptroller General has also determined that "before a contracting agency can award to the higher priced . . . technically superior offeror, the contracting agency is required to justify such award in light of the extra expenditure required." Reliability Sciences, Inc., B-205754.2, 83-1 CPD 612 at 8, and cases cited therein; University Foundation, California State University, Chico, B-200608, 81-1 CPD 54 at 3-4.

We recognize that the justification in the record necessary to support award to a higher priced vendor necessarily will vary with procurement circumstances. However, the kind of superficial lip service which the Air Force paid to cost here is wholly unsatisfactory. Even where a solicitation simply states that technical is more important than cost -- and nothing more about the relative importance -- cost cannot be treated lightly. 10 U.S.C. 2305(b)(4)(D). As we noted in Pyramid Technology Corp., GSBCA 8743-P, 87-1 BCA 19,580 at 99,023, 1987 BPD 11 at 11:

    Congress long ago rejected proposals that would have given contracting officers authority to ignore price considerations in negotiated procurements, and thus the [Competition in Contracting Act] is another legislative confirmation that price is a mandatory consideration in any source selection.

Indeed, in circumstances similar to this case, we recently concluded that an agency performed an inadequate price/technical tradeoff analysis and discounted price beyond the degree envisioned in the RFP. International Business Machines Corp.; Lockheed Missiles & Space Co., GSBCA 11359-P, 11362-P, 1991 BPD 227[foot #] 16 (IBM/Lockheed).
Although the decision remains protected, we explained our basic rationale:

    The RFP entitled the IRS to emphasize technical over cost, but not to the degree that it actually did based on the justifications that the [source evaluation board] cited. The selection decision therefore, unsupported as it is by an adequate price/technical tradeoff analysis, violated the requirement that the Government make award in accordance with the solicitation.

Id. at 2-3.

On reconsideration, we dismissed arguments that we had forced agencies to "close the 'price gap'" by quantifying the precise individual technical differences between proposals; prohibited agencies from relying on nonquantifiable factors in determining best value to the Government; established some sort of minimum cost ratio; and required the assignment of dollar values to specific technical differences. IBM/Lockheed, GSBCA 11359-P-R, 11362-P-R, 92-1 BCA 24,438 at 121,946, 1991 BPD 273 at 3. We explained:

    The basis of our decision was and remains that the solicitation required price to be a factor, though not the most important one, in the award decision, and that the IRS's explanation of its decision indicates that price was not a factor in any significant way. We did not grant the protest because the IRS failed to follow any of the so-called "principles" outlined above. We granted it because the IRS's tradeoff analysis fails to indicate that the Government will receive benefits commensurate with the price premium it proposes to pay. How such benefits might be demonstrated lies entirely within the IRS's discretion . . . .

----------- FOOTNOTE BEGINS ---------
[foot #] 16 The RFP in IBM/Lockheed indicated that technical features were more important than cost, but that award would not be made at a significantly higher overall cost to achieve slightly superior technical features.
----------- FOOTNOTE ENDS -----------

Id. at 121,946, 1991 BPD 273 at 3 (footnotes omitted).

Here, as in IBM/Lockheed, there was no meaningful analysis of the relative benefit of Contel's proposal versus the increased cost. The SSA here had not been provided with, and did not himself make, any reasoned analysis of how "the Government will receive benefits commensurate with the price premium it proposes to pay." As we stated in TRW Inc., GSBCA 11309-P, 92-1 BCA 24,389 at 121,789, 1991 BPD 205 at 14, "the SSA must determine what the difference in technical score represents, and then decide whether the additional technical merit is worth the additional cost."

Our recent decision in Computer Sciences Corp., GSBCA 11497-P (Dec. 30, 1991), 1992 BPD 6 (CSC), is not inconsistent with this result. There, we upheld a cost/technical tradeoff which was based upon a detailed analysis of the clear superiority of the awardee's technical features and an extensive analysis of costs by the SSAC and the SSA. Id. at 34. There, unlike here, the SSAC provided the SSA with a "best value analysis," including costs associated with certain key technical features in each proposal. We expressly found: "The SSA decided to buy the technically superior but higher priced system in the expectation that the greater cost would pay for itself in the increased efficiency of RCAS personnel and avoidance of massive upgrades over the twelve year life cycle." Id. at 34-35. No such cogent "best value" rationale was articulated here.

Nor does our decision in Honeywell Federal Systems, Inc., GSBCA 9807-P, 89-1 BCA 21,444, 1989 BPD 23, compel a different result.
In that case the Board expressly found as fact that:

    The selection official made the selection after considering the relative technical merit and cost of each proposal. He determined that the AT&T proposal was superior technically to protester's proposal and that the technical difference merited the additional cost. The most important differences between protester and AT&T, in his view, were in the integration and user friendliness areas -- ease of use, integration, ergonomics.

Id. at 108,044, 1989 BPD 23 at 24 (emphasis added).[foot #] 17

----------- FOOTNOTE BEGINS ---------
[foot #] 17 We do not read Honeywell to suggest that whenever all acceptable offers come in lower than the ICE, the SSA is free to discount cost. Such a procedure would violate the statutory mandate of 10 U.S.C. 2305(b)(4)(D) that price/cost be evaluated in every procurement. Nor did the Board indicate that (continued...)
----------- FOOTNOTE ENDS -----------

Based on the record,[foot #] 18 the Board in Honeywell ruled:

    Although protester alleges the source selection authority made no determination that AT&T's proposal was worth the price difference between protester and AT&T, this was precisely the determination he made.

Id. at 108,045.

In sum, we conclude that even when technical is a more important factor than cost, the Government, in performing a "best value" analysis, should examine and document whether the extent of technical superiority warrants a premium in price. Recognizing that there may be nonquantifiable elements involved in any such examination, the selection nevertheless must reflect an in-depth analysis of the circumstances which justify the tradeoff. The Air Force violated the statutory mandate in 10 U.S.C. 2305(b)(4)(D) as well as the terms of the RFP by failing to consider price adequately, and in particular whether the price premium of some $35 million was warranted.

Did the Air Force Evaluate Proposals Against Undisclosed Evaluation Standards?

Grumman contends that the Air Force evaluated proposals against additional technical requirements not set forth in the RFP which were more demanding than, or unrelated to, the requirements in the RFP, and that only when measured against these enhanced secret requirements was Contel scored higher than Grumman.

AFR 70-30 requires that the Air Force establish "standards" against which to evaluate technical proposals and prohibits disclosing them to offerors. Grumman, which was familiar with

----------- FOOTNOTE BEGINS ---------
[foot #] 17 (...continued) because the prices came in below the ICE, cost could be ignored. The Board said: "In these circumstances [the SSA] was free to place significant weight on technical excellence provided any difference in technical merit was worth any increase in cost." 89-1 BCA at 108,045, 1989 BPD 23 at 26 (emphasis added).

[foot #] 18 While the procurement sensitive details were not revealed in the Honeywell decision due to their protected status, this is not to say they were overlooked. We recognize that in Honeywell the costs of the proposals were not reported and the details of the SSA's analysis not recounted. Nonetheless, given the firm conclusion that the SSA in fact performed an adequate cost/technical tradeoff analysis, we have no reason to question its factual predicate or to infer that an inadequate factual predicate existed.
----------- FOOTNOTE ENDS -----------

the provisions of AFR 70-30, knew that under this regulation the Air Force would create undisclosed standards against which proposals were to be evaluated. Grumman does not ask the Board to invalidate AFR 70-30. Rather, Grumman contends that the Air Force acted illegally here because the requirements against which the Air Force measured proposals were not truly "standards" within the meaning of AFR 70-30 but rather were specific capabilities the Air Force wanted.

Grumman gives three examples of this evaluation technique: (1) the RFP required maximum independence from the network for each workstation, but the Air Force evaluated whether there were at least fifteen software applications available in standalone mode; (2) the RFP required offerors to "minimize single points of failure" in the system, but the Air Force evaluated the number of persons who would be affected by a single point of failure, not the number of points of failure; and (3) Grumman understood that the requirements for a "user friendly" system were those spelled out in section C of the RFP, but there was an undisclosed quantitative standard requiring eight "intuitive" software applications. We are thus asked to determine whether these three standards truly measured "minimum performance or compliance" with solicitation requirements or imposed hidden additional requirements.

As to the requirement for maximum independence from the network, we cannot say that a standard specifying fifteen software applications in the standalone mode was a secret additional requirement. Rather, it appears to us to be precisely the kind of undisclosed evaluation standard the Air Force -- without challenge from any offeror -- permitted itself to devise. The number of software applications available in the standalone mode is an objective measure of the independence of a workstation from the network. The same analysis applies to the number of intuitive software packages available. The first consideration of the user interface/integration standard within the user friendliness item in Section M of the RFP was "the degree to which software applications are intuitive." The standard simply set the acceptable number of software applications to meet that requirement -- exactly what AFR 70-30 envisions and what offerors should have known would occur.

Grumman contends that minimizing single points of failure is very different from limiting the number of people affected by a single point of failure. Grumman then complains that proposals were given evaluation credit only if "no single point of failure affected no more than thirty users" -- claiming that this was a secret requirement since the RFP only called for minimizing the points of failure. Grumman is correct that the number of persons permissibly affected by a single point of failure was nowhere disclosed to offerors. However, the fact that limiting the number of persons affected was of importance to the Air Force in conjunction with single points of failure was disclosed in the solicitation. The first question to be completed by offerors in Section L.21 expressly asked: "Describe failure points and the expected number of users affected if failure occurs." Protest File, Exhibit 54 at 284.
Vendors were thus put on notice that both might be considered in minimizing single points of failure.[foot #] 19 In this context, specifying the number of persons permissibly affected simply afforded the Air Force more of an objective yardstick with which to measure the extent of compliance.

Did the Awardee Fail to Meet a Mandatory Security Requirement in the RFP?

Grumman contends that Contel did not meet the solicitation's requirement to provide a "Trusted Computing Base" (TCB) that would be in the National Security Agency's Trusted Products Evaluation Program at the B2 security level within twenty-four months and sixty days of contract award. Grumman argues that Contel did not even propose a specific product to meet this requirement, that Contel never provided product literature or other information which demonstrated that it could meet the requirement, and that Contel's certification did not satisfy the RFP. Specifically, Grumman contends that Contel failed to comply with the certification requirement because: (1) Contel itself certified that it would comply with B2 even though it was not the "source" of the product; and (2) the certification was conditional. Protester's Posthearing Brief at 41.

We reject these contentions. Offerors were required to have a B2 level TCB in the formal evaluation process at NSA two years after contract award. No offeror had an extant B2 product that would satisfy the JSAN requirements. By its very nature a B2 TCB for JSAN is a state-of-the-art developmental undertaking. Thus, the lack of product literature and proof that a compliant product would be delivered did not render Contel's proposal fatally defective. Protester has not shown that Contel's system is incapable of ever reaching a B2 level of trust. CSC, 1992 BPD 6 at 36.

Nor is the fact that Contel itself certified that it would comply with B2 a sufficient ground to reject its solution as noncompliant. At the time it submitted its certification, Contel had an engineering basis for the product and had not consummated any vendor contracts for its B2 TCB; Contel viewed itself as the source of the product. Although deemed to be compliant for proposal validation purposes, both Grumman's and Contel's certificates, in one form or another, were "conditioned." Thus, all offerors were treated in like fashion by the Air Force. In Xerox Corp., GSBCA 9862-P,

----------- FOOTNOTE BEGINS ---------
[foot #] 19 Offerors were expressly advised in Section L that replies to the questionnaires would be used in evaluating proposals. Protest File, Exhibit 54 at 282.
----------- FOOTNOTE ENDS -----------

89-2 BCA 21,652 at 108,922, 1989 BPD 68 at 20, we recognized that solicitation requirements should be read in the least restrictive manner in order to further the statutory purpose of permitting a wide variety of products or services to qualify for award. Similarly, in Telos Field Engineering, GSBCA 11516-P (Dec. 30, 1991), 1991 BPD 355, neither offeror complied with the literal requirements concerning educational qualifications. The Board said: "[T]here can be no prejudice to a party when all proposals are subjected to an interpretation that uniformly allows less than strict compliance, is not arbitrary or capricious, and will afford the Government its actual needs." Id. at 11.

Did the Air Force Conduct Adequate Discussions with Grumman?
Grumman alleges that the discussions which the Air Force conducted with it were deficient in two ways: (1) the Air Force downgraded Grumman in several areas which it never raised with Grumman but did raise with Contel; and (2) the Air Force downgraded Grumman's proposal in an area in which it should have sought a clarification. Protester's Posthearing Brief at 83. The Board has addressed the Government's obligation in conducting discussions as follows:

    While the FAR requires discussions about deficiencies, at the same time it prohibits helping an offeror bring its proposal up to the level of other proposals through successive rounds of discussions, such as by pointing out weaknesses resulting from the offeror's lack of diligence, competence or inventiveness, otherwise known as technical leveling. 48 CFR 15.610(d)(1) (1988). In conducting negotiations on complex systems, procurement officials must navigate between Scylla and Charybdis, and cautiously address "deficiencies," but not "weaknesses," or otherwise engage in technical leveling.

Planning Research Corp., GSBCA 10472-P, 90-2 BCA 22,798 at 114,485, 1990 BPD 62 at 20-21. See also CSC, 1992 BPD 6 at 36. An agency is "required to exercise its discretion in conducting negotiations and its judgment must be respected and sustained unless clearly defective." Advanced Technology, Inc., GSBCA 8878-P, 87-2 BCA 19,817 at 100,271, 1988 BPD 67.

Areas Allegedly Never Raised with Grumman

Specifically, Grumman argues, first, that the Air Force allowed Contel -- and no other offeror -- to submit a certification from itself rather than from the source of supply, and that it permitted Contel to certify that it would use its best efforts to comply with the B2 requirement. Second, Grumman claims that the Air Force never revealed that its main concern with single points of failure was the number of users that would be affected by a single failure. Id. at 86. Third, Grumman claims that it was downgraded for offering only a single connection to the JIMS/JSAN gateway -- when Contel did the same thing but received a DR and changed its proposal, adding a second gateway and using a different computer -- a circumstance which Grumman did not know was permissible.

Grumman suggests that because the Air Force accepted Contel's B2 security certification, which was not from the source of supply and was conditional, it somehow failed to conduct adequate discussions with Grumman. The focal point of discussions is to apprise an offeror of deficiencies or uncertainties in the offeror's own proposal which preclude meaningful evaluation; the function of discussions is not to divulge problems in a competitor's proposed solution. Grumman here was given every opportunity to address the Air Force's concerns with its security proposal. Nor can we conclude that as a result of its discussions with Contel the Air Force lowered the requirements for security and failed to disclose that to the other offerors. The RFP's requirements for B2 security were applied evenly, recognizing that such was a delayed deliverable and given the vagaries of the NSA process and the state-of-the-art nature of the requirement. As was proper, the B2 requirement was applied in the least restrictive way, and Contel was properly downgraded in the evaluation, but not eliminated.

Nor were the discussions regarding the single point of failure issue inadequate. The Air Force asked Grumman how many users would be affected by a single point of failure in a way similar to the way in which it asked Contel.
The fact that Grumman changed its architecture after this DR was issued but perpetuated the problem in its new approach does not require the Air Force to reiterate the identical concern in a new DR.

Grumman's suggestion that the Air Force did not adequately advise it of the deficiency in its proposal due to the single gateway is also misplaced. Grumman and Contel received similar notification in the respective DRs. Grumman claims that Contel was permitted to substitute a computer different from that authorized by the RFP, the VS 300, for the gateway to JIMS. Grumman contends this violates the requirement for meaningful discussions because it was never told such substitution would be acceptable. Grumman's contention is based upon its overly restrictive interpretation of an illustrative diagram of the JIMS/JSAN transitional period in the RFP. Protest File, Exhibit 54 at C23. While this diagram showed the WANG VS 300 as an existing JIMS gateway through which connections could be made, the diagram also showed "other connected internal and external systems." That there was another existing WANG processor, the VS 10000, was no secret. There was no prohibition in the solicitation against using the Wang VS 10000, the existing database processor, as an additional gateway. The Air Force thus had no obligation to advise Grumman that it could use that computer as an additional vehicle to connect JIMS/JSAN.

Should a Clarification Request Have Been Issued Regarding Grumman's System Maintenance Time?

Grumman contends that its proposal was misinterpreted by the Air Force as stating that it would take twenty hours of human time to administer the system when it intended twenty hours of system time. Grumman asserts that the Air Force knew that this figure was unusually high and that, given the 500-odd DRs which were issued, it should have been advised of this in a clarification request. Protester's Posthearing Brief at 87. We disagree. Clarification as defined by Federal Acquisition Regulation 15.601 "means communication with an offeror for the sole purpose of eliminating minor irregularities, informalities or apparent clerical mistakes in the proposal." 48 CFR 15.601 (1991). Grumman's failure to specify that the "20 hours" meant system time did not fall into any of those three categories. Nor should it have been caught as an obvious mistake, given that the RFP was reasonably interpreted as requiring man-hours and that Grumman failed to say anywhere else in its proposal what the human time to administer the system per week was.

Decision

The protest is granted in part.[foot #] 20 We revise the Air Force's DPA to direct the Air Force to consider whether the specific technical enhancements offered by Contel justified the 58.8 percent higher cost, articulate and document the rationale for its conclusion, and either confirm award to Contel or terminate that award and then proceed with the procurement in accordance with statute and regulation.

----------- FOOTNOTE BEGINS ---------
[foot #] 20 Intervenor's motion to dismiss certain protest counts for failure to state a valid basis for protest is denied.