THIS OPINION WAS INITIALLY ISSUED UNDER PROTECTIVE ORDER, RELEASED TO THE PUBLIC IN REDACTED FORM ON JANUARY 13, 1993, AND IS BEING RELEASED IN ITS ENTIRETY ON MARCH 22, 1994

DENIED: November 20, 1992

GSBCA 11939-P

GRUMMAN DATA SYSTEMS CORPORATION, Protester,

v.

DEPARTMENT OF THE AIR FORCE, Respondent,

and

CONTEL FEDERAL SYSTEMS, INC., Intervenor.

Irwin Goldbloom, Peter L. Winik, David R. Hazelton, James H. Barker, and Curtis P. Lu of Latham & Watkins, Washington, DC, and Patrick O. Killian of Grumman Data Systems Corporation, Bethpage, NY, counsel for Protester.

Clarence D. Long, III and Joseph M. Goldstein, Office of the General Counsel, Department of the Air Force, Washington, DC, and Richard C. Bean and Paul E. VanMaldeghem, Hanscom AFB, MA, counsel for Respondent.

William H. Butterfield, Jed L. Babbin, Charlotte F. Rothenberg, and Lori Beth Feld of McGuire, Woods, Battle & Boothe, Washington, DC, counsel for Intervenor-Respondent.

Before Board Judges NEILL, HYATT, and WILLIAMS.

WILLIAMS, Board Judge.

This protest, filed on July 23, 1992, is the second protest lodged by Grumman Data Systems Corporation (Grumman) against the procurement of a network for the Joint Chiefs of Staff known as the Joint Staff Automation for the Nineties (JSAN). In the initial protest, JSAN I, we concluded that the Air Force, in awarding to Contel Federal Systems, Inc. (Contel), failed to perform an adequate cost-technical tradeoff (CTTO) analysis to determine whether paying a price 58.8 percent higher than Grumman's, a difference of some $35 million, was warranted. We therefore revised the Air Force's delegation of procurement authority (DPA), directing it to consider whether the specific technical enhancements offered by Contel justified the higher cost, to articulate and document the rationale for its conclusion, and either to confirm award to Contel or to terminate that award and proceed in accordance with statute and regulation.
In an effort to comply with the Board's directive, members of the Source Selection Advisory Council (SSAC) and the Source Selection Evaluation Board (SSEB) prepared a comparative cost-technical tradeoff (CTTO) analysis, which delineated certain "quantifiable" and "nonquantifiable" discriminators between the two proposals. According to the analysis, the quantifiable discriminators indicated that the Contel proposal provided an estimated $42.25 million of added value to the Government. The Source Selection Authority (SSA), after considering this quantifiable analysis and stressing the nonquantifiable discriminators, concluded that the added value of Contel's system to the Joint Staff was worth the additional cost.

Grumman has challenged the cost-technical tradeoff analysis on two major grounds. First, Grumman contends that the CTTO failed to follow the request for proposals' (RFP's) evaluation scheme, under which offerors were to receive evaluation credit only when they met or exceeded previously established evaluation standards by offering components, features, and capabilities which provided "substantial benefit" to the Government in accordance with the technical items and considerations delineated in the RFP. Grumman argues that under the solicitation, only components, features, or capabilities which received evaluation credit in the technical evaluation could be cited to justify paying a price premium in the CTTO analysis. Thus, according to Grumman, technical areas where offerors failed to meet or exceed the requirements, or where there were no corresponding standards, could not properly be considered in the tradeoff analysis. Protester's Posthearing Brief at 17, 22, 29-32. Second, Grumman contends that even if the Air Force could depart from the RFP's evaluation scheme, the cost-technical tradeoff analysis was unreasonable in numerous respects.
Grumman challenges the Air Force's analysis of quantifiable and nonquantifiable discriminators in user friendliness, system architecture, and transition, but argues that even if these discriminators are valid, the award must be set aside because the Air Force failed to discount the quantifiable discriminators. Grumman claims that the Air Force erred in calculating the quantifiable discriminators to be $42.25 million. Using this number, the Air Force had determined that Contel's proposal was a better value by $8.35 million -- $42.25 million minus the $33.9 million difference between Contel's and Grumman's evaluated prices. Grumman points out that the Air Force failed to apply the MPC (most probable cost) and PV (present value) discounts to the $42.25 million figure, even though these discounts had been applied to the evaluated prices which resulted in Contel's price being $33.9 million higher. Grumman contends that once the discounts are subtracted, the $42.25 million becomes $18.72 million; when this is subtracted from the $33.9 million, the Grumman proposal represents a better value, based on the quantified discriminators, of $15.18 million.

Further, recognizing that the SSA testified that this reduction of the quantified discriminators would not have changed his selection due to the tremendous impact of the nonquantifiables, Grumman urges the Board to dismiss this testimony as an improper post hoc justification. Finally, Grumman claims that the Air Force's and Contel's trial experts "performed a new cost-technical tradeoff analysis to augment the one at issue" and asserts that this new analysis is not legally cognizable because the standard of de novo review does not permit such post hoc rationalizations. Protester's Posthearing Brief at 5.

Although we agree that the CTTO analysis was flawed in certain respects, we nonetheless conclude that, based upon the record as a whole, the decision to award to Contel at a price premium of some $33.9 million is justified.
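The competing calculations can be illustrated arithmetically (a sketch only; dollar figures in millions are taken from the record as recited above, while the combined MPC/PV discount factor is merely implied by Grumman's numbers, not stated independently):

```python
# Arithmetic behind the two tradeoff positions (figures from the opinion).
quantified_value = 42.25   # Air Force's undiscounted value of Contel's advantages
price_premium = 33.90      # Contel's evaluated (MPC/PV-discounted) price premium

# Air Force position: compare the undiscounted value to the discounted premium.
air_force_net = quantified_value - price_premium       # 8.35 in Contel's favor

# Grumman's position: the $42.25M must first be discounted the same way the
# evaluated prices were; the opinion gives the discounted result directly.
discounted_value = 18.72
implied_discount_factor = discounted_value / quantified_value  # roughly 0.443
grumman_net = price_premium - discounted_value         # 15.18 in Grumman's favor

print(round(air_force_net, 2), round(grumman_net, 2))  # 8.35 15.18
```

The dispute thus turns less on the subtraction than on which side of the comparison should be discounted before the figures are netted against each other.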
Findings of Fact

On January 22, 1991, the Air Force issued RFP number F19630-90-R-0006 for JSAN. Protest File, Exhibit 54.[foot #] 1 The procurement was to integrate existing computers and associated peripherals known as JIMS (Joint Staff Information Management Systems) with the contractor-provided JSAN system, and to manage the transition between JIMS and JSAN with minimal disruption to the Joint Staff.[foot #] 2 Id. The RFP contemplated a 4- to 5-year transition period involving 200-400 users per year. Id. at 47.

The Joint Staff is comprised primarily of action officers, individuals who receive "constant taskings from senior management" in a variety of areas. Transcript at 442.[foot #] 3 Their work on the JSAN system will include preparation of briefings, charts, and graphs for presentation to Congress and senior officers in both domestic and international matters. Id. There will be a significant amount of reworking of documents and constant interruptions. Id. at 443.

The RFP envisioned the award of an indefinite-delivery, indefinite-quantity (IDIQ) type contract for a base period and seven option years. The contract was to be fixed price for all Contract Line Item Numbers (CLINs) except the labor-intensive CLINs, which would contain an economic price adjustment (EPA) provision. Id. at Executive Summary.

----------- FOOTNOTE BEGINS ---------
[foot #] 1 The Board accepted the entire protest file from the JSAN I protest in this proceeding and will refer to it simply as "Protest File." The additional protest file in this proceeding will be designated Protest File JSAN II.
[foot #] 2 JIMS is an "umbrella term" which defines "a wide variety of hardware and software and operates as a series of parts rather than a unified whole."
----------- FOOTNOTE ENDS -----------
This procurement was conducted under Air Force Regulation (AFR) 70-30, Streamlined Source Selection Procedures, and was administered by the Air Force Computer Acquisition Center (AFCAC) at Hanscom AFB in Massachusetts. Protest File, Exhibit 114 at 1. AFCAC employed the alternative source selection organization using an SSEB, an SSAC, and an SSA. AFR 70-30 required the Air Force to establish objective evaluation standards in addition to the requirements and evaluation criteria set forth in the RFP, but prohibited the Air Force from disclosing such standards to offerors. AFR 70-30 provides in pertinent part:

a. The Technical Team will establish objective standards at the lowest level of subdivision of specific evaluation criteria.

b. Standards, which indicate the minimum performance or compliance acceptable to enable a contractor to meet the requirements of the solicitation and against which proposals are evaluated, will be prepared for the lowest level of subdivision within each area of the specific evaluation criteria and be approved by the SSET chairperson. . . .

Air Force Federal Acquisition Regulation Supplement, Appendix BB, 6 CCH Government Contracts Reptr. 40,953.90 at 28,983. In accordance with AFR 70-30, prior to the issuance of the RFP, AFCAC and the Joint Staff developed standards which were approved by the SSAC. Transcript JSAN I at 35, 41-42.

----------- FOOTNOTE BEGINS ---------
[foot #] 3 References to the transcript in this protest will be "Transcript"; references to the transcript in JSAN I will be "Transcript JSAN I."
----------- FOOTNOTE ENDS -----------

The Solicitation

User Support

Paragraph C1, Introduction, Section 1.1, of the RFP provided, in part:

The Contractor shall act as the single integrator for this contract, integrating the Contractor-provided computer system into the Joint Staff's existing data processing environment.
As integration requirements evolve that are not specifically identified in a current CLIN, the Government will issue task orders to the Contractor under the Support Personnel CLINs (7510-7650) to meet those requirements.

Protest File, Exhibit 54 at 39. Paragraph C.15.7 set forth the requirements for a user support team as follows:

The Contractor shall provide persons with the skills to provide users with the technical assistance in the areas of systems orientation, systems initialization, follow-on training, system capabilities awareness, and other skilled areas desired to enhance the utilization of the JSAN system. The user support team shall staff a JSAN User Support Team Center providing system administration capabilities (backups, restorations, archiving, etc.) for all corporate JSAN components, operations, and maintenance of all JSAN components, installation of new equipment, and assistance in converting existing JIMS documents to the new formats for JSAN, as well as maintaining effective configuration control of all JSAN components. The user support team shall be on site. The Government will provide phone service. The User Support Team Center will operate from 0800 to 1700, Monday through Friday.

Id. at 153. Table B-18 of the RFP required offerors to price CLIN 7650 as follows:

CLIN 7650, User Support Team (per month): $ __________

Id. at 37. Table M-1 of the solicitation, the Project Implementation Schedule, set forth the months for which user support could be ordered as follows:

CLIN  CM 3  FY92  93  94  95  96  97  98  99
7650          24  48  24  12  12   6   6   6

Id. at 332; Transcript at 167. According to the Air Force's program manager, these "months" listed under the various fiscal years were user support "team" months, so that in FY 1992 the offeror was giving a price for two teams and in FY 1993 for four teams. Transcript at 175.
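The program manager's reading of Table M-1 amounts to a simple conversion (a sketch only; the team-month figures come from the table, and the twelve-month team-year divisor is an assumption drawn from his testimony about FY 1992 and FY 1993):

```python
# Table M-1 figures read as "team months": teams priced in a fiscal year
# equal team-months divided by 12 (assuming a team is priced for a full year).
team_months = {"FY92": 24, "FY93": 48, "FY94": 24, "FY95": 12,
               "FY96": 12, "FY97": 6, "FY98": 6, "FY99": 6}

teams_per_year = {fy: tm / 12 for fy, tm in team_months.items()}
print(teams_per_year["FY92"], teams_per_year["FY93"])  # 2.0 4.0
```

On this reading, the later fiscal years call for fractional team-years, i.e., support teams ordered for only part of the year.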
Training

Paragraph C.14.7 of the solicitation mandated that the contractor provide, at a minimum, training in the form of specified courses described in CLINs 7010 through 7410.[foot #] 4 Protest File, Exhibit 54 at 136-37. Each of these CLINs described the course and specified its approximate duration. CLIN 7010, a course entitled "User Overview," was described as introducing a noncomputer-literate user to the office automation environment and lasting approximately three days. Id. at 136. CLIN 7110, "Network Overview," was to teach users the network, its features, and functions and was to last two days. Id. at 137. CLIN 7160, "Transition to JSAN," was to transition an experienced user into the JSAN environment, covering features of all office automation tools used by eighty percent of the Joint Staff, and was also to last two days. Id.

The Requirements for Workstation Response Time

In generally describing the equipment being procured for JSAN, Paragraph C.10 stated in pertinent part:

C10 CLINs 0001-0920, Equipment. The following are the minimum characteristics, capabilities, and capacities of the equipment that the Contractor shall provide, install, and maintain under this contract. . . . The Government requires four logically different types of computing support for the Joint Staff:

a. Communications processing subsystem
b. High speed processing subsystem
c. Database processing subsystem
d. Office automation processing subsystem

These different types of computing support may be satisfied by differing configurations of physical processors. The Contractor is permitted to combine the differing types of processing in order to provide the best mix of processing units. The resulting mix of processors shall provide the required throughput, response time, backup, and expansion capabilities for each type of computing support.
----------- FOOTNOTE BEGINS ---------
[foot #] 4 The solicitation also delineated a number of general requirements for training, including Paragraph 14.2.4, which permitted commercially available training courses that address the specific functions/features of JSAN components (software/hardware) to be provided under this contract.
----------- FOOTNOTE ENDS -----------

Protest File, Exhibit 54 at 62-63 (emphasis added). Paragraph C.10.7, CLINs 0610-0799, Workstations and Associated Equipment, provided in part:

The workstations and associated peripheral equipment and software acquired under this contract shall provide a flexible, reliable, integrated, and expandable platform as well as a standard user software interface to support the mission activities of the Joint Staff users.

Protest File, Exhibit 54 at 71. CLIN 0610 mandated that the "Basic Workstation" operate at a minimum sustained performance rate of 6 SPECmarks and meet the response times in Attachment 4 to the RFP. Id. at 74. CLIN 0620 mandated that the "Advanced Workstation" operate at a minimum sustained performance of 10 SPECmarks and meet the response times in Attachment 4. Id.

Attachment 4 to the RFP, "System Performance Requirements," set forth certain "Response Time Requirements" for various tasks. For example, the attachment stated:

RESPONSE TIME REQUIREMENTS

Within a networked environment with the workloads and user mix indicated in this attachment (with surges of up to 50%) and 8 concurrent broadcast video sessions, the Contractor shall meet the following response times:

NOTE: All response times are measured from the time the user strikes the key to execute a function until the action is completed and a new command can be entered.

1.
Response times (to user at the workstation; time in seconds):

                                                 AVERAGE   99% of Attempts
Initiate any Contractor-provided application        1             2
Complete loading an application from
  network storage to a workstation                 30            40
Display a full-screen raster image                  1             1

CLIN 1000 - Operating System/System Administration: . . .

Password entered to applications avail.            10            12
Format up to 1.44 MB (3 1/2") of floppy media      90            95
Format 15MB of hard drive media                    60            65
Add new user (exclusive of data entry)            120           150
Delete user                                        20            25
User time required for workstation admin.           0             0

Id., Attachment 4 at 3.

Software Upgrades

Paragraph C.11 generally described the software requirements and contained the following provision for upgrades:

C11 CLIN 1000-1590, Software. The Contractor shall provide software for all processing subsystems, LANs, and workstations as specified in this section. . . . The latest released version of software shall be provided with support documentation in accordance with Contract Data Requirement List (Exhibit A, CDRL A0001). All subsequent upgrades of software, along with support documentation, shall be provided to the Government and installed as they become available. . . . A hardware or firmware solution is an acceptable substitution for a software function.

Protest File, Exhibit 54 at 87.

Technology Improvement

The RFP contained the following Technology Improvement clause:

5.7 Technology Improvement. The Contractor shall establish and implement a technology improvement program that maintains JSAN as a state-of-the-art system employing current, available technology throughout the life of the contract. Integration of all technology improvements recommended by the Contractor will be approved at the option of the Government (see H42 for procedures concerning contractor proposed improvements). Approval will be based upon economic benefits, performance capabilities, and cost. The Contractor shall provide a Technology Improvement Plan (Exhibit J, CDRL J001).
Protest File, Exhibit 54 at 50.[foot #] 5

Evaluation

Paragraph M.2, "Basis of Award," provided that once a proposal had been determined to have met all mandatory solicitation requirements and to have complied with law:

the Source Selection Authority (SSA) will determine the proposal which provides the best overall value to satisfy Air Force needs. This selection will be based on how well each proposal satisfies the evaluation criteria as described in paragraph M3 and also on an integrated assessment of the proposals submitted in response to this RFP.

Id. at 318. Paragraph M.3, "Evaluation Criteria," stated:

3.1 General. The evaluation will be based on an integrated assessment of the offeror's proposal. The integrated assessment of the offeror's proposal will address mandatory requirements, terms and conditions, technical factors, life cycle cost, and level of Live Test Demonstration (LTD) performance. The offeror's proposals will be evaluated and rated by the Government. This assessment will address two areas (listed in order of importance) of consideration which are:

Technical
Cost

Id. at 319. The RFP did not state how much more important technical was than cost.

----------- FOOTNOTE BEGINS ---------
[foot #] 5 Paragraph H42.1, Improvements, provided:

42.1 After contract award, the Government may solicit, and the Contractor is encouraged to propose independently, improvements to the hardware, software, services, features or other requirements of the contract. These improvements may be proposed to save money, to improve performance, to save energy or for any other purpose which presents an advantage to the Government. As part of the proposed changes, the Contractor shall submit a price proposal to the Contracting Officer for evaluation. Those proposed improvements that are acceptable to the Government will be processed as modifications to the contract.

Protest File, Exhibit 54 at 218.
----------- FOOTNOTE ENDS -----------
In the area of technical evaluation, the solicitation provided for "evaluation credit" as follows:

Offered components, features and capabilities beyond the minimum mandatory requirements specified in the RFP, Section C references below will be evaluated depending upon the degree to which they benefit the Government. To receive evaluation credit the offered additional component, feature and/or capability beyond the minimum requirements must substantially benefit the Government as determined by its contribution to the specific items, factors, and considerations identified below.

Protest File, Exhibit 54 at 319 (emphasis added). The six technical items were listed in descending order of importance:

User Friendliness
System Architecture
Transition
Security
Maintenance
Management

No weights were assigned. Id. at 319-23. The RFP provided that proposals would be evaluated in each technical and cost criterion for compliance with the requirement and soundness of technical approach. Id. at 319. For each of the six technical items, specific assessment criteria were then listed in descending order of importance, unless expressly stated to be otherwise. Id.

User Friendliness: RFP Evaluation Provisions

Within the first item, user friendliness, there were two evaluation factors expressly stated to be equal in importance: user interface/integration and application software functionality. Id. at 319-20. User interface/integration had five considerations:

a. The degree to which software applications are intuitive.[foot #] 6

----------- FOOTNOTE BEGINS ---------
[foot #] 6 The solicitation defined intuitive as:

A user's ability to rely on recognition rather than recall when using an application. The less recall required, the more intuitive the application. References are simple such that a user can use real world expectations and apply them to using the application.

(continued...)
----------- FOOTNOTE ENDS -----------

b.
The degree to which program function keys and/or menu interfaces are consistent across applications and keystrokes are minimized.

c. The degree to which an UNDO feature is available within all applications.

d. The degree to which the help and error messages are easily understood and context sensitive.

e. The degree to which text and graphics can be integrated and moved between the various software applications.

Id. at 320 (citations omitted). Application software functionality had one consideration: the number of desired capabilities provided by the offeror. Id. (citations omitted).

System Architecture: RFP Evaluation Provisions

Within the technical item system architecture there were three factors, listed in descending order of importance:

Communications network
Workstation
Corporate storage approach

Id. at 320-21. Within these factors there were three considerations each under communications network and workstation, and one under corporate storage. The pertinent consideration under communications network was "redundancy/availability of network . . . and how system prevents single points of failure." Id. at 320. The first consideration under workstation was "degree of functionality in standalone mode." Id. at 321.

Transition: RFP Evaluation Provisions

The solicitation provided that in general the transition item "is an evaluation of the degree to which the offeror's proposed system facilitates the transition to JSAN." Results of the Live Test Demonstration (LTD) were to be incorporated into the evaluation of this item. Protest File, Exhibit 54 at 321.

----------- FOOTNOTE BEGINS ---------
[foot #] 6 (...continued) Id., Attachment 1 at 4.
----------- FOOTNOTE ENDS -----------

There were two factors under this item, Transition Approach and Transition Impact. Under transition approach were two considerations: architecture approach for integration of JIMS/JSAN, and word processing and graphic file conversion. Id.
Under the second factor, Transition Impact, there was one consideration: the degree of disruption for JIMS users during their transition to JSAN. Id. Factors and considerations were set forth similarly under the other items. Id.

The RFP: Cost Evaluation

The RFP provided:

M4 Contract Life Cycle Cost Evaluation. 4.1 General. The Government's Most Probable Cost (MPC) for the total life cycle of the contract will be determined by using the prices, terms and conditions of the CLINs/SLINs . . . in conjunction with their associated quantities, delivery dates, and probabilities of occur[r]ence as contained in the cost model. The Government's evaluation will be based on purchase only plans. In addition, an economic analysis that considers the current Treasury Bill Discount Rate will be conducted to determine the present value of monies.

Id. at 324.

The Original Evaluation Process

The technical evaluation consisted of a two-step process: validation and evaluation. Transcript JSAN I at 151. Validation was a determination of whether a proposal met the RFP's minimum mandatory requirements and was eligible for award. Id. Following validation, evaluation of proposals against the standards checklist commenced, and evaluators reviewed proposals as well as the offerors' responses to the questionnaires in Section L. Id. at 155. In order to receive evaluation credit for exceeding the minimum requirements, an offeror had to meet the standards. Id. at 36.

Technical evaluators prepared an Evaluation Standards Checklist for each proposal and, for each standard, determined whether a proposal met (indicated by a checkmark), failed to meet (indicated by a -), or exceeded (indicated by a +) the particular standard; these checklists also described the strengths and weaknesses of the proposal vis-a-vis each standard. Protest File, Exhibits 2736-39, 2752-54.

Evaluation Results

The pertinent color and risk assessments used in the evaluation ratings were defined to mean:

Blue: Exceptional.
Exceeds specified performance or capability in a beneficial way to the Air Force; and has high probability of satisfying the requirement; and has no significant weakness.

Green: Acceptable. Meets evaluation standards; and has good probability of satisfying the requirement; and any weaknesses can be readily corrected.

Yellow: Marginal. Fails to meet evaluation standards; and has low probability of satisfying the requirement; and has significant deficiencies but correctable. . . .

High Risk--likely to cause significant serious disruption of schedule, increase in cost, or degradation of performance even with special contract[or] emphasis and close Government monitoring.

Moderate Risk--can potentially cause some disruption of schedule, increase in cost, or degradation of performance. However, special contractor emphasis and close Government monitoring will probably be able to overcome difficulties.

Proposal Evaluation Guide (PEG), Protest File, Exhibit 114 at 12, 13.

Contel was rated Green/Moderate Risk in the overall Technical Area, whereas Grumman was rated Yellow/High Risk. Protest File, Exhibit 2823. The technical items were individually rated as follows:

Item                   Grumman   Contel
User Friendliness      Yellow    Green
System Architecture    Yellow    Green
Transition             Yellow    Yellow
Security               Green     Yellow
Maintenance            Blue      Green
Management             Green     Yellow

The risks for Grumman and Contel were as follows:

Item                   Grumman   Contel
User Friendliness      High      Moderate
System Architecture    Moderate  Low
Transition             Moderate  Low
Security               Moderate  High
Maintenance            Low       Moderate
Management             Moderate  Moderate

Protest File, Exhibit 2825 at 30-37. The soundness ratings for Grumman and Contel were as follows:

Item                   Grumman      Contel
User Friendliness      Marginal     Acceptable
System Architecture    Acceptable   Acceptable
Transition             Acceptable   Acceptable
Security               Acceptable   Marginal
Maintenance            Exceptional  Marginal
Management             Acceptable   Marginal

Id.
Evaluation of Grumman's Proposal as to User Friendliness

The SSEB's item summary for user friendliness explained the rationale for Grumman's Yellow/High Risk rating as follows:

Rating Assessment: This vendor was rated Yellow for the first factor (User Interface/Integration) and Blue for the second factor (Application Software Functionality). In the first factor the vendor exceeded 1 of the 5 standards and failed the other 4. Based on an overall assessment of the 5 standards and the relative degree of compliance (how close were those standards that failed and how well the one standard was exceeded) the vendor was judged to be Yellow for the factor. The fact that the vendor did very poorly and was not close in meeting any of the failed standards, the Vendor, although rated yellow was at a minimal level within that rating.

A significant problem is that the character-based applications do not fully support the capabilities of the Graphical User Interface (GUI). The lack of GUI applications severely detracts from the ability to implement integrated suite of software applications.

The second factor consisted of only 1 standard. The vendor exceeded this standard and was rated Blue for the factor. Even with equal weight given to each of the two factors, the minimal yellow rating of the second factor can not allow a higher rating than yellow for the item.

The preponderance of potential risk for the overall implementation for this item lies with the first factor (User Interface/Integration). Therefore due to the increased risk, the item is given an overall risk rating of high.

Protest File, Exhibit 2823 at 87.

The Evaluation of Grumman's Proposal Against the Five Standards for User Interface/Integration

Grumman failed the first standard under user interface/integration, which was met when eight or more software applications were demonstrated to be intuitive. Protest File, Exhibit 2754.
Six and one-half of Grumman's applications were rated as intuitive, making the rating a stated "20% below the minimum to meet the standard." Id. The most used programs for the Joint Chiefs are word processing, electronic mail, and composition graphics. Transcript JSAN I at 393.

Grumman also failed the second standard for user interface/integration, which was met "when 50-70% of applications demonstrated use the same function keys for similar operations; [n]o operations demonstrated require more than 4 keystrokes." Protest File, Exhibit 2754. Two evaluators submitted the same comment:

Only 4 of 17 applications . . . use the same function keys for similar operations . . . . Whether the questionable . . . product plus twelve other products can provide a user friendly environment is highly risky.

Id. Grumman failed Standard 4, i.e., whether "help messages are easily invoked [and] help and error messages are easily understood and context sensitive." Id. The evaluators concluded that not all software had context-sensitive help. Id. Standard 5 was whether "movement and integration of text and graphics can be accomplished." Id. One evaluator found Grumman met this standard, but the other determined Grumman failed it. Id.

The Evaluation of Contel's Proposal as to User Friendliness

The User Friendliness Item Summary for the Contel proposal stated, in part, the following:

Rating Assessment: This vendor was rated Green for the first factor [Interface/Integration]. . . . In the first factor the vendor exceeded 1 of the 5 standards, met 1 standard and failed to meet 3 standards. Based on an overall assessment of the 5 standards and the relative degree of compliance (how close were those standards that failed and how well the standard was met or exceeded) the vendor was judged to be Green for the factor. The second factor consisted of only 1 standard. The vendor exceeded this standard and was rated Blue for the factor.
However due to the marginal nature of the Green rating for the first factor, the overall rating for the Item is Green. The preponderance of potential risk for the overall implementation for this item lies with the first factor (User Interface/Integration). Therefore this item is rated overall moderate in risk.

Protest File, Exhibit 2823 at 107.

The Evaluation of Contel's Proposal Against the Five Standards for User Interface/Integration

Contel met the first standard regarding intuitive software, having offered ten applications which were demonstrated to be intuitive. It received a "high checkmark," or high passing evaluation, from one evaluator for this standard. Protest File, Exhibit 2746. Electronic mail, however, was noted to be cumbersome and a "substantial weakness" under this same standard. Id. Contel, like Grumman, failed the second standard because only three of seventeen applications used the same function keys for similar functions. Id. Both Contel and Grumman exceeded the third standard, which was met "when 80-90% of all commands demonstrated for UNDO can undo the previous command." Id.; Protest File, Exhibit 2754. Like Grumman, Contel failed standards 4 and 5. Protest File, Exhibit 2746.

Evaluation of Grumman's Proposal in System Architecture

In the second technical area, system architecture, Grumman was rated Yellow/Moderate Risk. Protest File, Exhibit 2823; Transcript JSAN I at 241. The SSEB Report provided, in pertinent part:

RATING ASSESSMENT: . . . . The offeror has proposed a network comprised of an FDDI [Fiber Digital Data Interface] backbone with Ethernet sublans. However, this backbone does not take full advantage of the dual ring FDDI technology due to reliance on Ethernet connections to corporate processors and use of two separate brouters to connect the communities-of-interest to each other and to corporate storage. Single points of failure could affect up to 320 users.
The proposed workstation solution did not meet the standard for standalone operation or administrative workload, but did exceed the standard for MS-DOS performance. The offeror's stated administrative workload for the corporate storage is one hour, which if correct would exceed the standard. However, the offeror failed to address functions which are associated with corporate storage (i.e., system administration), resulting in a "met the standard" rating. WEAKNESSES: - The network design proposed has single points of failure which could affect large (300+) user groups . . . . - The network design proposed provides for only a single Ethernet connection between JSAN and JIMS. . . . . . . . - As proposed, only 14 of 27 applications will run on the workstation in standalone mode . . . . - An administrative workload of 20 hours per week for 20 workstations is required . . . . RISK ASSESSMENT: Overall risk for this item is moderate. The offeror did not fully address the system administration requirements. His estimate of 20 hours per week to administer a 20 user community-of-interest translates to 1600 hours per week for a 1600 user network, or a 40-man workforce simply to administer JSAN workstations. . . . The single connection between JIMS and JSAN is a critical point which could isolate JIMS users from JSAN users in the event of failure. Protest File, Exhibit 2823 at 88-89; Transcript JSAN I at 241. The workstation factor summary for Grumman which was not included in the SSEB report revealed that Grumman had failed to meet two out of three of the standards and exceeded the third standard. Protest File, Exhibit 2780. Evaluation of Grumman's Proposal Against the Pertinent Standards for System Architecture Grumman failed the third standard under the communications network factor which was met "when no single point of failure affects no more than 30 users." Protest File, Exhibit 2752. A single point of failure in Grumman's system could affect up to 320 users. Id. 
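The risk assessment's staffing extrapolation quoted above can be reproduced arithmetically. The sketch below assumes a standard forty-hour work week, which the record does not state explicitly; the 20-hours-per-20-workstations figure and the 1,600-user network size are from the evaluation.

```python
# Reproducing the SSEB risk assessment's extrapolation of Grumman's stated
# administrative workload. The 40-hour work week is an assumption; the record
# gives only the 20-hours-per-20-workstation-group figure and the 1,600 users.
hours_per_week_per_group = 20    # Grumman's stated workload per 20-user group
users_per_group = 20
network_users = 1600

hours_per_user = hours_per_week_per_group / users_per_group  # 1 hour/user/week
total_hours = hours_per_user * network_users                 # hours per week
admins_needed = total_hours / 40                             # 40-hour work week

print(total_hours)    # 1600.0 hours per week
print(admins_needed)  # 40.0 -- the "40-man workforce" cited by the evaluators
```

The arithmetic confirms that the evaluators' "1600 hours per week" and "40-man workforce" figures follow directly from Grumman's stated 20-hour workload, on the assumption that the proposal meant manhours.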
Standard 1 for workstations was stated to be met when fifteen-twenty applications run in the standalone mode. Protest File, Exhibit 2752. Grumman failed this standard because fourteen applications, some only partially, ran in standalone mode. Id. Standard 2 of the workstation factor evaluated the amount of time required to administer centrally a twenty-user community of interest. Protest File, Exhibits 2752, 2753. The standard was met if the time required was one to two hours per week. Grumman's proposal stated the time as twenty hours, but did not specifically state whether that was manhours or system hours. Protest File, Exhibit 2849A at I-J-116. Although Grumman intended the twenty hours to mean network time or system processing time and not manhours, the Air Force interpreted the proposal to mean manhours. Transcript JSAN I at 262. However, Grumman never said anywhere in its proposal the total number of manhours it would take for system administration functions. Id. at 719. The Air Force did not send Grumman any deficiency reports or questions relating to this number of manhours. Id. The evaluation checklists comparing Grumman's offer against the workstation standards did not reference the SPECmark rating, since there was no standard for workstations which directly matched this requirement. Protest File, Exhibit 2752. The Evaluation of Contel's Proposal as to System Architecture The SSEB report item summary for Contel's system architecture included the following: RATING ASSESSMENT: The offeror met the standard for the communications network factor, met the standard for the workstation factor, and exceeded the standard for the corporate storage approach factor. The offeror proposes an FDDI backbone with Ethernet segments to support communities-of-interest, corporate processors, and corporate storage. The network is designed with a good amount of redundancy to minimize failures. The workstation proposed is binary compatible with the corporate processors. 
The offeror's approach to managing corporate storage minimizes the need for administrator intervention. STRENGTHS: . . . . - Binary compatibility among all workstations and corporate processors. - Provides the capability to run 24[[foot #] 7] of the 27 desired applications in standalone mode. . . . - System administrator functions should take no more than 8 hours per week to administer corporate storage. . . . WEAKNESSES: - The proposed network design includes single points of failure which could affect up to 192 JSAN users (i.e., router failure). . . . - The offeror stated that workstation administration would take 5-8 hours per week to administer a 20 workstation group. . . . Protest File, Exhibit 2823 at 108. Like Grumman, Contel exceeded the first standard and failed the second standard under the communications factor. Protest File, Exhibit 2753. Contel also failed the third standard, but less egregiously than Grumman since a single point of failure could affect up to 192 users. Id. The LTD Results Offerors' performance at the LTD was also considered in the evaluation. E.g., Protest File, Exhibit 2823. In discussing Grumman's performance of the word processing demonstration at the LTD, the Air Force noted: There are many instances (see output results) that would aptly demonstrate the user UNfriendliness of this software. Little items like page numbers, line numbers, column position indicators and the like that didn't work properly made it more difficult to work through the document. As demonstrated, there are indications that requirements in the RFP aren't being met (see results also). It is not apparent if this is actually the case or if the demonstrators themselves are not "expert" enough to demonstrate the full capabilities of the software. Id. at 75. ----------- FOOTNOTE BEGINS --------- [foot #] 7 As the Air Force later realized, Contel only offered nine applications in the standalone mode, not twenty- four; the other fifteen applications were options. 
Transcript JSAN I at 116-18. In JSAN I we credited testimony that the SSAC and SSA were told Contel only offered nine. In the CTTO analysis the error that Contel offered twenty-four applications was repeated and the SSA took this wrong number into account, but could not say that it had a specific impact on his selection decision. Transcript at 816, 837-38. ----------- FOOTNOTE ENDS ----------- 

The Original Cost Evaluation

The SSEB report contained a cost evaluation report with a ranking of best and final offers (BAFOs) as well as separate cost analyses for each offeror which included a breakdown for, inter alia, hardware, maintenance items, and software. Protest File, Exhibit 2823. The ranking was:

Proposal     PV      PV              % Above      CD      Current
Code         Rank    Dollars         Ranking 1    Rank    Dollars
[Grumman]    1       $57,677,052       -----      1       $ 69,476,354
[Contel]     2       $91,584,742       58.8%      2       $114,381,666
[       ]    3       $93,829,206       62.7%      3       $117,890,868

Id. at 113.

The SSEB Report

The SSEB Report contained detailed evaluation data down to the level of the six item summaries for each of the three offerors in the competitive range. Protest File, Exhibit 2823. The report also contained cost evaluation data for the three offerors, concluding that with the possible exception of one CLIN each for Grumman (2600 -- developed software maintenance) and Contel (1090 -- security) all prices offered were reasonable and realistic.[foot #] 8 Id. The minutes of the SSAC briefing included the following comment:

d. There was much discussion on the User Support Area. What are we getting for the money? How many people/teams? ANSWER: All vendors met minimum mandatory. The price doesn't necessarily reflect the level of effort, as pricing must be taken as a whole for a given proposal.

Protest File, Exhibit 2828.
This reflected a concern due to the $10 million price difference between Grumman and Contel in the ----------- FOOTNOTE BEGINS --------- [foot #] 8 General comments on Grumman's cost included: CLIN 1480, Documentation Conversion Utility, price appears to reflect the server license change made at BAFO and was evaluated as correct. However, the maximum number of users columns [sic] on Table B-10 was not changed. Assuming the maximum number of users columns [sic] is correct and the pre-BAFO quantities should be used, the overall evaluated cost would be about $3.5M higher for this offeror. Protest File, Exhibit 2823 at 121. ----------- FOOTNOTE ENDS ----------- user support CLIN, CLIN 7650. Transcript at 173-74. Grumman's BAFO price for CLIN 7650 was $6,700 per month; Contel's BAFO price for this CLIN was $115,141 per month. Transcript at 271, 305. The Original Selection The SSEB report was given to the SSA, and based on that report, the minutes of the SSAC briefing, and information conveyed to him during a briefing on December 4, 1991, he selected Contel. Transcript JSAN I at 482-83, 498, 520. The First Protest Grumman challenged the award to Contel on several grounds in Grumman Data Systems Corporation, GSBCA 11635-P. The Board granted the protest in part and revised the Air Force's delegation of procurement authority (DPA), directing it to consider whether the specific technical enhancements offered by Contel justified the 58.8 per cent higher cost, to articulate and document the rationale for its conclusion, and either confirm award to Contel or terminate that award and proceed in accordance with statute and regulation. The Cost-Technical Tradeoff Analysis Following the Board's decision, the SSA reconvened certain members of the SSEB and SSAC and directed them to conduct and prepare a cost-technical tradeoff analysis. Transcript at 59; Protest File JSAN II, Exhibit 6. The SSA directed the group to quantify the differences in the two proposals where possible. Id. 
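The scale of the user support pricing disparity noted above can be checked with simple arithmetic. The sketch below assumes the full eight-year contract life (96 months) described elsewhere in the record and applies no MPC or PV discounting; under those assumptions the monthly BAFO gap on CLIN 7650 compounds to roughly the $10 million figure that concerned the SSAC.

```python
# Illustrative check of the user support (CLIN 7650) price gap. The 96-month
# (eight-contract-year) horizon is an assumption drawn from the contract term
# described in the record; no MPC or PV discounting is applied here.
grumman_monthly = 6_700    # Grumman's BAFO price per month for CLIN 7650
contel_monthly = 115_141   # Contel's BAFO price per month for CLIN 7650

monthly_gap = contel_monthly - grumman_monthly
life_cycle_gap = monthly_gap * 96  # eight years, undiscounted

print(monthly_gap)     # 108441
print(life_cycle_gap)  # 10410336 -- on the order of the $10 million discussed
```
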
Using the SSEB report as a starting point, and relying on the requirements stated in the RFP, the item chiefs compared the proposals against each other. Transcript at 61. During this phase, which took about two to four weeks, differences between the proposals, or "discriminators," were identified. Id. at 62. Subsequently, some of the discriminators were quantified. Id. at 62-63. In performing the CTTO analysis, the Air Force did not feel that it could only find an advantage for an item, component, or feature if that item, component, or feature substantially benefitted the Government as determined by its contribution to the specific factors and considerations identified in Section M. Transcript at 186-87. Rather, the program manager felt that Section M.3.3.1 of the RFP only came into play in the technical evaluation that was performed initially and not in the head-to- head comparison of proposals in the CTTO analysis. Id. at 187. In the written CTTO analysis, the Air Force summarized the enhanced value of Contel's technical proposal as follows: Although both proposals were responsive, they were not equivalent in their technical satisfaction of the requirement. Contel not only offered a "better" solution to the stated requirements, but their proposal greatly exceeded the minimum mandatory requirements in the areas deemed most important in the specified evaluation criteria. This analysis answers the question of whether or not the $92 million Contel proposal is "worth" $34 million more than Grumman's. Cost impacts fall into two categories: those for which estimated dollar values can be quantified and those which the SSEB could not quantify (see Exhibit 1). Every effort has been made to specifically allocate cost impacts to one of the six evaluation items specified in the RFP. However, several costs are associated with technical issues involving multiple items, and as such, a direct allocation methodology was not always possible. 
Based on the assessment of quantifiable cost/risk discriminators (see Exhibit 1), the adjusted cost of Grumman's proposal is computed at $99.95 million, $8.35 million greater than the Contel evaluated price of $91.6 million. Specific cost and technical issues associated with each Section M evaluation item are discussed in this document. Protest File JSAN II, Exhibit 6 at 1-2. The quantified discriminators were valued at a total of $42.25 million, but this figure was not discounted by either the MPC or PV discount factors, as the offerors' prices had been. Transcript at 66; Protest File JSAN II, Exhibit 6, Exhibit 1. The MPC discount took into account the probability of ordering certain quantities of various line items in each of the eight contract years. Transcript at 64. For example, in the first three months of the contract the likelihood of ordering workstations and operating systems was 100 percent, but that decreased to 70 percent in years one, two, and three and 50 percent in years four through eight of the contract. Id. at 64-65. As reflected in the initial SSEB report, the price differential between Grumman and Contel was $66 million if the MPC and PV discounts were not applied, and $33.9 million when these discounts were applied. Protest File, Exhibit 2823. The program manager for JSAN testified that, in hindsight, it would have been more accurate to apply the MPC discounts to applicable areas of the $42.25 million value attributed to quantified discriminators and to adjust the hourly rate and the number of users that would be expected to use JSAN. Id. at 56, 68-69. The Air Force depicted the quantifiable cost/value differential as follows:

QUANTIFIABLE COST/VALUE DIFFERENTIAL IN MILLIONS

Contel's Evaluated Price                               $91.60
Grumman's Evaluated Price                              $57.70
Proposal Amount Difference                             $33.90

Adjustments: Added Value of Selecting Contel's Proposal

Quantifiable Factors:
1. User Friendliness
   A. Training Time: Time Away from Office              $3.50
   B. Time to Gain Proficiency                         $15.00
   C. Cost of Classroom Training                        $0.30
2. Systems Administration
   A. Additional SA Support Requirements               $10.45
   B. Workstation Upgrades                              $7.20
3. Technical Support
   A. Technical Support not Included in Offer           $5.80

Quantifiable Value Differential                        $42.25
Value Differential Less Proposal Amount Difference      $8.35

Based on quantifiable discriminators, the selection of Contel's proposal is an $8.35 million better value for the Government. Protest File JSAN II, Exhibit 6, Exhibit 1.

MPC and PV Discounts to Government Estimate

According to respondent's expert's[foot #] 9 calculations, the total quantifiable discriminators' value under the original Government estimate was reduced from $42,287,000 to $19,730,000 when MPC and PV discounts were applied. Transcript ----------- FOOTNOTE BEGINS --------- [foot #] 9 Respondent's expert, a Senior Manager at Grant Thornton, was admitted as an expert in the fields of management information systems, mathematical analysis of business problems, and the dollar impact of those problems. Transcript at 1804-05, 1838-40. However, for purposes of identification in this decision, he will be referred to as respondent's expert. ----------- FOOTNOTE ENDS ----------- at 1851-52; Intervenor's Exhibit 11. According to protester's expert,[foot #] 10 the number was reduced to $18,720,000; the difference is due to computational approach, primarily because protester applied the MPC discount to the user support team, while respondent did not. Id. at 1852.

The Quantified Discriminators for User Friendliness

As the above chart reflects, the Air Force quantified three discriminators within user friendliness: 1) Time spent on formal (away-from-the-office) training, 2) time spent on on-the-job training, and 3) cost for classes. Protest File JSAN II, Exhibit 6, Exhibit 2.
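The Air Force's quantifiable cost/value differential set out above can be verified arithmetically; the sketch below simply recomputes the chart's totals from its line items (all figures in millions).

```python
# Arithmetic check of the quantifiable cost/value differential
# (Protest File JSAN II, Exhibit 6, Exhibit 1). All figures in millions.
discriminators = {
    "Training Time: Time Away from Office": 3.50,
    "Time to Gain Proficiency": 15.00,
    "Cost of Classroom Training": 0.30,
    "Additional SA Support Requirements": 10.45,
    "Workstation Upgrades": 7.20,
    "Technical Support not Included in Offer": 5.80,
}
value_differential = round(sum(discriminators.values()), 2)
proposal_difference = round(91.60 - 57.70, 2)  # Contel minus Grumman, evaluated
net_value = round(value_differential - proposal_difference, 2)
adjusted_grumman = round(57.70 + value_differential, 2)

print(value_differential)  # 42.25
print(net_value)           # 8.35 -- the claimed better value of Contel
print(adjusted_grumman)    # 99.95 -- the "adjusted cost" of Grumman's proposal
```

The internal arithmetic is consistent, but, as the record notes, the $42.25 million value was never MPC- or PV-discounted while the $33.90 million proposal difference was, so the two sides of the subtraction are not computed on the same basis.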
These discriminators did not equate to any specific factor or considerations in the RFP or to any specific undisclosed evaluation standards, but rather reflected differences between proposals which the technical item chiefs identified during a head-to-head comparison of the two proposals. Transcript at 441. Formal Training Time As reflected in Attachment A to this opinion, the Air Force divided users into two categories for purposes of formal training: 1) new users with no experience and 2) "current" users, those users familiar with Wang/Windows. Protest File JSAN II, Exhibit 6, Exhibit 2. The Air Force evaluators determined that it would take an average of five days to train new users on the Grumman system, and two days to train new users on the Contel system. Id. The Assistant Chief of the Office of Joint Manpower and Personnel Data Management (JMPDM), who is also the information systems manager for this Joint Staff Directorate, developed these formal training time estimates. Transcript at 352, 371. He currently provides technical assistance to the users in this Directorate. Id. at 418. This Assistant Chief, JMPDM, has served on the Joint Staff since 1990 and has an Undergraduate Degree in Education and a Masters Degree in Administrative Sciences - Information Systems Management from the George Washington University. Id. at 352, ----------- FOOTNOTE BEGINS --------- [foot #] 10 Protester's expert, a certified public accountant and formerly the Vice President and National Director of Petersen Consulting's Government Contract Practice, was admitted as an expert in (1) proper accounting practices generally; (2) proper accounting principles and practices related to Government contracts, including price analysis related to the evaluation of competitive proposals; (3) the statistical application of these accounting principles; and (4) accounting matters related to cost/technical tradeoffs, in particular in automatic data processing equipment procurements. 
We will refer to this expert as protester's accounting expert. ----------- FOOTNOTE ENDS ----------- 415. He also was an evaluator on the initial JSAN procurement and is very familiar with the software packages in the two proposals. Id. at 418-19. He has personally trained approximately a few hundred users of microcomputers and has put together training courses. Id. at 428. He explained his decision to quantify training time as follows: For me, based on my experience, my expertise, the best area for me to come up with some sort of a quantified number was in the area of the training and how that relates in the short term both in terms of the time for conducting the training and in the time for users to become familiar with the software to the point that they no longer fight it but they just use it. We didn't have anything that said this is the amount of training necessary other than what we had asked for in the RFP. We didn't have anything that said this is how long users are going to take to get up to speed except again my experience on the Joint Staff. The users that we have on the Joint Staff currently take five months, four to five months, to become familiar with the current system, which is fairly cumbersome to use, but in its entirety very similar in terms of complexity to that system that Grumman proposed. It takes much less time for our users to become familiar with their own work station, which is much more similar in terms of complexity to the Contel proposal. Id. at 429. The Air Force's Assistant Chief, JMPDM, further explained: We currently train four days in what we call a primary users course, which has operating the Wang software as well as how to use the user interface, and then how to use those two together in what minimal way as possible, and then we generally give an additional two half-day courses for standalone word processing and standalone graphics. That is approximately five days worth of training for new users coming to the Joint Staff. 
The bulk of that time is spent on learning how to use the Wang portion. Approximately two days of that time is spent learning how to use the standalone work station portion. Given that the Grumman solution was approximately as complex as our current solution, the training requirements are approximately the same. That was my best estimate. As far as the Contel solution, the training estimates are approximately what our current training requirements are for the standalone work station. Hence, the two days and the five days. Id. at 382. When asked what principal characteristics made the training estimate for the Grumman system the same as that for the JIMS Wang system, this estimator replied: The fact that the Wang system that we currently have has two distinct user interfaces, one for the Wang applications and one for the desktop standalone applications, and how that is exactly similar to the difference between the Uniplex applications and the rest of the Motif applications. If anything, because the way that the Uniplex applications use the mouse is even more different from Motif than the way the Wang applications allow you to use the mouse in the window, if anything it is worse than what we have. It is certainly at least as complicated. Q Are there other major factors that lead you to conclude that there would be an equation in the training time between the two? A The complexity and difficulty of using the software itself, the kinds of things that it takes to do, both systems require the user remember a lot of things. There are, if you go with shortcuts, and there are plenty in both, then you have to memorize all of those shortcuts. If you don't go with shortcuts, the menus, although they are on the screen and you can call them up, it is not readily apparent just from looking at the names of the top level menus which menu's tree you should go down to find the function that you want. Transcript at 470. 
In formulating the classroom training estimates, the Assistant Chief, JMPDM, did not take into account the offerors' proposals for formal training classes required by CLINs 7010 through 7410. Id. at 373. He explained why:

I did not rely on the amount of time because the amount of time proposed was what we said in the RFP was our estimate. Those times in the RFP were based on our estimate of what we thought it should take to train a system given that, currently, we train our users for four days when they walk in just to use the basic system, and then they get an extra couple of half-day sessions, and we expected that the system would be easier to use than what we currently have, so we estimated three days should probably be about right, and we put that in the RFP. The offerors came back and gave us classes that were three days long; that's what we asked for in the RFP, but there is no indication that there is any analysis on the part of the offerors to say, yes, three days is the right amount of time. They just proposed back to us in all cases what it was that we said was our estimate.

Id. at 373-74. In estimating the formal training, the Assistant Chief, JMPDM, assumed that both offerors would propose adequate classes in terms of quality and effectiveness. Id. at 375, 377. The Air Force's program manager, who is also a user of the current system and who had training on this system, agreed with the two-day Contel and five-day Grumman formal training estimates. Transcript at 72. Protester's expert in human/computer interaction, including user productivity, user training, user interface design, and user interface analysis (protester's computer expert), found fault with the Air Force's assumption of a two-day formal training time for Contel and a five-day formal training time for Grumman, finding it hard to believe there could be anywhere near that much difference; he thus viewed the estimates as unreasonable. Transcript at 1265, 1289, 1318.
He cited several reasons for his opinion. First, the expert stated that the organizations doing the training, i.e., Grumman and Contel, due to their experience, would have the best idea of how long it takes to train people on their systems and they each proposed a three-day course for new user training. Id. at 1318-19. Second, the magnitude of the difference between Contel and Grumman was extremely large -- two and one-half times longer to learn one system with essentially the same functionality. Id. at 1319. Third, half the software applications used by an end user who was not a programmer were encompassed in the same software packages from both offerors. Id. at 1321. This left the remaining applications, five software packages in Grumman's solution and eight in Contel's -- Grumman's proposed Uniplex is used for more applications. Id. at 1322. Focusing on word processing and E-mail, which were among the most frequently used programs, the expert testified:

I looked at basically E-Mail and word processing and I looked to see what the differences were in the systems that could account for large differences in learning time. And I've already mentioned the major differences that I saw which were in the fact that the menus in UNIPLEX work differently than the Motif menus, so that's a little bit more you have to learn, but I don't consider that to be a major learning thing that's going to take three days to learn how to use those menus. So, those differences seemed quite small. On the other side, the integration between the E-Mail and the word processing meant that many of the common E-Mail activities, writing letters, writing messages . . . in the UNIPLEX solution, because of the integration, its -- when you're writing a message, you can just use the word processor, so you can -- already know how to use that, and you can use the full functionality of the word processor. So, not a whole lot that you have to learn to use the E-Mail.
Now, I couldn't get a real good reading on the Contel E-Mail because I didn't have a manual for that. It was not Asterix. But I did notice that it had been criticized in the SSEB as having cluttered screens and confusing menus. So, I felt that it couldn't be a superior product that was obviously much better in view of that criticism. So, if pressed, I would give maybe a slight edge on E-Mail to Grumman in terms of how easy it would be to learn, because you don't have to learn as many different things, a slight edge to word processing on Contel because you've got the Motif compliance, so there's a little bit less you have to learn to interact with it.

Id. at 1323-25. Protester's computer expert summarized: "there would be no significant difference necessary for formal training time between the two systems, and certainly not anything approaching the magnitude of a two-day to five-day." Id. at 1325.[foot #] 11

The $29 Per Hour Figure

The Air Force determined that the Joint Staff would lose productivity by having employees attend formal training and attributed a $29 value to each such lost hour of work. ----------- FOOTNOTE BEGINS --------- [foot #] 11 Respondent did not offer an independent expert in support of its formal training time assumptions, and intervenor's computer expert did not address that point. Id. at 1609. Protester's accounting expert testified to his generalized "observation" that "most learning curves will fall within a very narrow band of 70 to 80 per cent" or .8 to 1. Transcript at 1113-18; 1207-11. We give no weight to his testimony on this point since he did not attempt to compare the two systems involved in any meaningful way. ----------- FOOTNOTE ENDS -----------
We had researched some budgetary figures and for, I believe it was, Fiscal Year '93, the comptroller came up with a number. We divided it by the number of people assigned on the Joint Staff, and I divided that number by 2,080, which would be the number of work hours in a year that I had remembered from a business course I had taken, and came up with $29.00 dollars an hour. Transcript at 76. The program manager further testified that since doing the CTTO analysis, he learned that the $29 per hour figure was very low because indirect costs such as taxes, medical benefits, hazardous duty pay, and pensions were left out. Transcript at 192. Protester's accounting expert testified that it would not be appropriate to use a labor rate to quantify a dollar impact in formal training unless the Government could show that the higher number of training days associated with the Grumman proposal would in fact cause the Government to incur extra labor costs or forego labor savings as a result of the training difference. Transcript at 1103-07, 1158-59, 1165-67, 1186-87, 1195-97, 1205-06, 1239-40. Protester's accounting expert believed that the effect of additional formal training time for eighty-five percent of the Joint Staff, be it hours or days, should have been zero because there was no additional cost paid by the Government. These employees simply worked overtime for no compensation. Transcript at 1161. Respondent's expert on the other hand testified that the weighted average hourly rate for all grades on the Joint Staff, civilian and military, should have been "loaded" with fringe benefits and medical benefits and ranged over the contract life from $38 in the first year to $52 in 1999. Transcript at 1870- 72. Respondent's expert obtained a current list of the number of personnel assigned to the Joint Staff, by labor, grade, and service which included civilian and military composite pay rates. Transcript at 1855-57; Intervenor's Exhibit 4. 
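The program manager's derivation of the $29 figure reduces to a simple division. The budget and headcount in the sketch below are hypothetical placeholders (the actual Fiscal Year '93 figures do not appear in the quoted testimony), scaled only so the division reproduces the $29 result; the 2,080-hour work year is from his testimony.

```python
# Sketch of the $29-per-hour derivation the program manager described:
# annual Joint Staff budget, divided by headcount, divided by 2,080 hours.
# The budget and headcount below are hypothetical placeholders; only the
# 2,080-hour work year and the $29 result are from the record.
WORK_HOURS_PER_YEAR = 2_080  # 52 weeks x 40 hours

def hourly_rate(annual_budget: float, headcount: int) -> float:
    return annual_budget / headcount / WORK_HOURS_PER_YEAR

# Working backward, a $29 rate implies an average annual cost per person of:
implied_annual_cost = 29 * WORK_HOURS_PER_YEAR
print(implied_annual_cost)  # 60320 -- about $60,000, unloaded

# Hypothetical illustration: 1,600 people at that annual cost yields $29/hour.
print(round(hourly_rate(1_600 * 60_320, 1_600), 2))  # 29.0
```

The roughly $60,000 unloaded annual figure is consistent with the program manager's later acknowledgment that $29 was very low because fringe costs were omitted, and with respondent's expert's loaded rates of $38 to $52 per hour.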
The list was published by the Department of Defense and reflected the annual cost for all military and civilian personnel. Id. at 1856-59. These compensation rates included fringe benefits other than medical benefits. Id. at 1862-63. The expert used these compensation rates and applied an adjustment for medical benefits for the military personnel and calculated the $38-$52 weighted hourly average rates. Id. at 1869-71.

The Number of Users

In determining the number of users, the Air Force assumed that there would be a turnover of both civilian and military personnel every three years. Id. at 83. In addition, the number of users was based upon the number of operating systems in CLIN 1000. Id. at 84. This included the 1,600 at the Pentagon and additional personnel in other locations for a total of 1,856-60. Transcript at 196-97. The MPC discount factor was not applied to the number of users because the people using the system would not necessarily equate to the number of workstations; some workstations are operated on three eight-hour shifts. Id. at 197. Respondent's expert opined that the number of users on the Joint Staff who had to be trained, i.e., 1,850, was "the most reasonable alternative for the number of users." Transcript at 1878. The solicitation, Attachment 4, contained estimated number of users with access to processing subsystems as well as an average user job mix:

                                          Minimum   Maximum   Average
Communications Processing Subsystem          250      2,100     1,300
Office Automation Processing Subsystem       200      1,600     1,200
Data Base Processing Subsystem                 0        650       400
High Speed Processing Subsystem                0        300       100

Note: Minimum should be achieved by end of contract year two, maximum numbers by end of contract. . . . .
Estimated User Job Mix (by user, per day):

Average active system time per user per day: 5 hours ("active" is defined as executing an application and entering/receiving data)

Average User Job Mix (for 5 hours):
Word Processing       1.5 hours
E-mail                1.0 hours
Graphics              0.5 hours
Database              0.5 hours
Calendar              0.1 hours
Spreadsheet           0.1 hours
Other Applications    1.3 hours

Protest File, Exhibit 54, Attachment 2 at 6. The information in Attachment 4 of the RFP was intended to provide offerors information on how to size their proposed systems so they could handle the job mix across the Joint Staff. Transcript at 507-08. The Air Force arrived at an increased cost for the Grumman proposal by multiplying the twenty-four hours (three eight-hour days) extra training time by $29 (which represented the average hourly rate of employees of the Joint Staff) and by the number of new users to be trained, 3,420. This multiplication yielded a figure of $2,380,320. Protest File JSAN II, Exhibit 6, Exhibit 2.

Formal Training: Current Users

The Air Force determined that it would take one day to train current users on the Contel system and an average of five days on the Grumman system. Protest File JSAN II, Exhibit 6, Exhibit 2. This four-day difference multiplied by the same hourly rate and 1,153 users yielded an increased cost of $1,069,984 for training current users on the Grumman system. Id. Protester's expert opined that the Air Force's estimates for current users of one day of formal training for Contel and five days for Grumman were inappropriate based upon several factors: the proposals provided a transition course for existing users which is two days long; users who are familiar with a system's functionality do not take as long to train as new users; and none of the software in either proposal is the same as the existing system. Id. at 1346-47. According to the CTTO, the total increased cost for formal training of new and current users on the Grumman system over the Contel system was $3.5 million.
Protest File JSAN II, Exhibit 6, Exhibit 2.

Time to Gain Proficiency or On-the-Job Training

New Users

On-the-job training referred to the time it took the average user to become fully proficient on either the Grumman or Contel system. Transcript at 108, 378. The Air Force determined that it would take new users one month to become proficient on the Contel system and an average of five months to become proficient on the Grumman system, via on-the-job training. Protest File, Exhibit 6, Exhibit 2; Transcript at 378-79. This assumed five hours per day of system usage in accordance with Attachment 4 of the RFP. Protest File, Exhibit 6, Exhibit 2 at 2. The Air Force did not take the formal training classes into account in developing the estimate of the number of months for on-the-job training. Transcript at 389.

In considering the value of the proposals in the area of on-the-job training, the Air Force evaluators assumed that users during this period would not be operating at a full 100 percent proficiency. Transcript at 115. They therefore applied a 25 percent inefficiency factor. Id. at 115-16, 480. Stated differently, the Government applied a 75 percent average effectiveness rate for the entire on-the-job training period: in Contel's case, for one month; in Grumman's case, for five months. Id. at 118. The program manager had originally established a thirty percent inefficiency factor, but after consultation with the training estimator reduced this to twenty-five percent. Id. at 115. He testified:

Q But between 50 and 100, you basically just guessed at the number 70, is that fair?

A I wouldn't consider it a guess. I think it was a reasonable assessment that at some point after the formal training that the individuals should be more than 50 percent, and certainly less than 100 percent. You just don't go from classroom training and achieve 100 percent level of proficiency right away.

Id. at 115.
The Air Force's training estimator referred to the period of on-the-job training as a coming-up-to-speed period, saying "it starts out something less than 75 percent and it goes up . . . past 75 percent and by the time they get to the end of that period they are at 100 percent of their effectiveness." Transcript at 480. In calculating a dollar value for the increased time spent in on-the-job training for new users under the Grumman proposal, the Government multiplied the difference in the on-the-job training time by the $29 hourly rate, by the number of new users -- 3,420 -- and by the 25 percent inefficiency factor implied by the 75 percent average effectiveness rate, which resulted in a total of $10,909,800. Protest File JSAN II, Exhibit 6, Exhibit 2.

Current Users

With regard to current users, the Government concluded that it would take one-half of a month to train on the Contel system, but an average of five months on the Grumman system. Protest File JSAN II, Exhibit 6, Exhibit 2; Transcript at 379-80. In explaining why on-the-job training for the Contel system was shorter, the Air Force's estimator said:

For users that already have experience on using our system, when they go to the Contel system, the user interface is very similar, the way that the software operates is very similar, the word processing has similar kinds of menus, and they find the same kinds of functions, the file menu has how you open a new file, how you save a file, exiting, printing, that falls in under file, and those same locations for where those menu picks are found on the current work stations they will find them in the same place on the Contel system.
So given that, my assessment was that people who had had experience will not require as much training, they just need to learn what is different about the system, but people on the Grumman system, when they go to use the Grumman system, they have to basically be trained from scratch because although the user interface, the windowing environment is similar, the applications which run in that environment are very different, and about as complex and difficult to learn as the ones currently running on the Wang, but no where near similar to what we have on the Wang.

Transcript at 382-83.

Multiplying the difference in on-the-job training time by the $29 hourly rate, by the number of current users, 1,153, and by the 25 percent inefficiency factor resulted in a savings of $4,137,829 attributable to the Contel proposal. Protest File, JSAN II, Exhibit 6, Exhibit 2. Protester's expert in human computer interaction found it "ridiculously hard to understand a difference of that order of magnitude -- 500 percent -- between two systems of the same functionality based solely on some small differences in the user interface. . . ." Transcript at 1348. In his experience, differences between full software systems are extreme if they are two to one; here the differences are five to one (for new users) and ten to one (for current users). Id. Protester's expert also characterized the Air Force's methodology for measuring the on-the-job training differences between the two systems as "gut feelings" and opined that this methodology was unreliable. Transcript at 1349-50.[foot #] 12 In the expert's view, the fact that the average time spent by all Joint Staff users using the system was five hours a day would cast further doubt on the assumption it would take a user five months to become proficient on the Grumman system. Id. at 1353.
This expert found the seventy-five percent effectiveness assumption to be totally unreasonable because the figure was speculative and was inappropriately applied as a constant rate across the time difference. Transcript at 1369; see id. at 1356-62; Protester's Exhibits 19, 20. Intervenor's computer expert neither validated nor disputed the seventy-five percent productivity rate, other than to say that certain curves when averaged would yield a seventy-five percent average. Transcript at 1592-94, 1614-25.

----------- FOOTNOTE BEGINS ---------

[foot #] 12 The expert admitted that on-the-job training is difficult to measure and said that the way it is normally done is an empirical study, giving people tasks using the system and measuring their productivity from the beginning to the point where their productivity starts to "flatten out." Id. at 1350-51.

----------- FOOTNOTE ENDS -----------

With regard to on-the-job training estimates for current users, protester's expert said that the estimate of one-half month to learn the Contel system versus five months to gain proficiency on the Grumman system was ridiculous because the two new systems shared a lot of the same software and had totally different software from JIMS. Id. at 1373.

Cost for Classes

Using the previously established number of days of formal training for new and current users on the Grumman and the Contel systems, the Air Force calculated that the total cost of classes would be $330,248 more for Grumman. Protest File, Exhibit 6, Exhibit 2. These costs were based on the offerors' prices for CLINs 7010, 7110, and 7160, i.e., the contractor-provided courses for "User Overview," "Network Overview," and "Transition to JSAN." Id.; Protest File, Exhibit 54 at 135-36; Transcript at 121. Contel and Grumman had both bid a total of seven days for these three courses, and Contel's cost per day was $219, while Grumman's was $91. Id.
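The training cost figures recited above can be reproduced arithmetically. The short sketch below is the editor's reconstruction, not a calculation stated in the record in this form; it assumes twenty-two workdays per month, five usage hours per day (per Attachment 4), and that the multiplier applied to the on-the-job training difference was the 25 percent inefficiency factor (the complement of the 75 percent average effectiveness rate).

```python
RATE = 29          # average hourly rate for Joint Staff employees ($)
NEW_USERS = 3420   # new users to be trained
CUR_USERS = 1153   # current users to be trained

# Formal training: extra classroom time on the Grumman system.
new_formal = 24 * RATE * NEW_USERS       # three extra 8-hour days -> $2,380,320
cur_formal = 4 * 8 * RATE * CUR_USERS    # four extra days at 8 hrs/day -> $1,069,984

# On-the-job training: assumed 22 workdays/month x 5 hrs/day = 110 usage hours
# per month, with 25% of that time treated as lost productivity.
HOURS_PER_MONTH = 22 * 5
new_ojt = (5 - 1) * HOURS_PER_MONTH * 0.25 * RATE * NEW_USERS    # -> $10,909,800
cur_ojt = (5 - 0.5) * HOURS_PER_MONTH * 0.25 * RATE * CUR_USERS  # -> ~$4,137,829

print(new_formal, cur_formal, round(new_ojt), round(cur_ojt))
```

Together with the $330,248 difference in class costs, these components sum to roughly the $18.8 million total that the CTTO attributed to training.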
In the CTTO analysis, the total additional cost for the Grumman proposal attributable to formal training, on-the-job training, and classes was $18,828,121. Protest File JSAN II, Exhibit 6, Exhibit 2. The CTTO described the overall impact of the increased training as follows:

Training and lost productivity are significant factors because they affect the Joint Staff user population of 1627 Action Officers, most of whom do not specialize in using automated systems. This was a major factor in establishing user friendliness as the most important evaluation item. Additional losses to productivity and effectiveness occur when an action officer is unwilling or unable to attend extended training away from the office. Many officers, especially those in senior positions, will not attend a week of training, reducing both their ability to use the system and the effectiveness of the Joint Staff as a whole. Other officers would have to perform the tasks not done by the ones who are unable to effectively use the system.

Protest File JSAN II, Exhibit 6 at 6-7.

Nonquantifiable Discriminators in the Area of User Friendliness

The Air Force identified five discriminators in the area of user friendliness which it could not quantify, all relating to long-term use:

- Ease of system use/learning
- Consistent and logical menu layout
- System intuitiveness
- GUI (graphical user interface)/WYSIWYG (what you see is what you get)
- Help sensitive assistance

Protest File JSAN II, Exhibit 6, Exhibit 4.

In summarizing the nonquantifiable benefits of the Contel system's user friendliness, the Air Force in its written CTTO analysis stated:

The user friendliness of the Contel system would also have a positive impact on user productivity over the long-term after the initial training and familiarization periods are over.
The intuitiveness, ease of use, context-sensitive "Help", and consistency of Contel's application software would provide continued savings of user time and a continued higher quality output of work products (e.g., briefings, decision papers, analyses, etc.).

Protest File JSAN II, Exhibit 6 at 2.

The Air Force continued in the Offeror Comparison:

Contel's proposal offered the best technical solution for user friendliness. The majority of the software was designed to make full use of the proposed X-windows Motif user interface. Contel's primary user applications were all proposed to operate in accordance with the Motif style guide. In contrast, Grumman's and Syscon's proposals offered marginal technical solutions for user friendliness. The majority of their software was not designed to make full use of the proposed X-windows Motif user interface. Users of Grumman's . . . system would require substantial training and long-term assistance. Usability of the system as a whole will be significantly reduced, since many applications use unique menus and protocols, while Contel's system will be more usable and user friendly, since most applications have the same menus and window accoutrements.

Id. at 6.

The Air Force's Assistant Chief, JMPDM, testified that although he could not quantify the long-term differences in user friendliness, he felt that users of the Contel system would be more productive over the long term because Contel's system was easier to use. Transcript at 448. He testified:

Contel users were going to be more productive over the long-term because the software was generally easier to use, and when they had to do something, they wouldn't spend as much time figuring out how to do, particularly an unfamiliar task, which comes up quite regularly. It is not the same familiar task all the time, but we have unfamiliar tasks that come through all the time that we have to figure out how to do, and make it look like this.
It is much easier for the Contel user to figure that out than the Grumman user. That means that the Contel user is not going to waste as much time figuring out how to do something, and is going to be spending that time instead doing that or some other task. That translates to me as a long-term productivity gain, but I did not know how to quantify that gain.

Transcript at 449.

He further explained how user friendliness impacts the action officer using the JSAN system:

Because the action officer will be changing things more, reorganizing things, some of those functions that a secretary might not use very often, are going to be needed more frequently, or least more urgently by the action officer at the Joint Staff. They may not need it very often, but when they need it, they need it right now, and they need to be able to figure out how to use it.

. . . .

Hence, the intuitiveness function -- aspect. Yes, they may have been trained how to use something and it may have been six months to a year since they first saw that. Because they have seen it before, they know that it's doable. But they don't know how to do it necessarily. They only know that it can be done. That's where intuitiveness plays a big factor because the action officer then, in an intuitive package, can say, yes, I know how to do that, I'm changing the way this text looks, so I'm going to look up under text, and, yes, there it is, or not. I mean, that's the difference.

Transcript at 445-46.

He cited Contel's applications, including its Asterix office automation, which were designed to operate in a way which maximizes use of GUI technology and direct manipulation.[foot #] 13 Transcript at 435. Grumman's primary office automation suite of software, while displaying an interface at the desktop level which permits some direct manipulation, does not carry those attributes down below the initial interface level into the applications. Transcript at 459-60.
Intervenor's expert testified:

The basic style of interaction that you see in the Uniplex package reminds me of the kinds of packages that I saw designed around like, say 1985, '86, '87 time frame where certain kinds of menu-based keyboard oriented types of interactions were dominating, and then marginal improvements are seen being made . . . . One of the characteristics of these kinds of user interfaces often is that they, pardon the expression, but interact the hell out of the user every now and then in the sense that they solicit a lot of detailed information, and you end up . . . as a user, supplying that information one piece at a time, and that can become very cumbersome. . . . Asterix, on the other hand, reminds me of the user interactions of much more recent vintage where the concept of direct manipulation has been used in conjunction with the use of the keyboard and other kinds of user interface types of things may be around, like function keys and such, to try to make a system which is a lot more responsive to the kinds of things that the user might be doing.

Transcript at 1528-29.

Protester's computer expert stated that direct manipulation "is not a major feature of Uniplex." Transcript at 1300. He further testified:

Q [I]n terms of GUIness, what is different about Asterix when compared to Uniplex?

A The main difference is in the area that some of the operations have direct manipulation characteristics in Asterix that aren't there in Uniplex. The example on cutting and moving as opposed to cutting and pasting is one. The Asterix makes more use of iconic and graphical interaction in things like setting margins, and a variety of activities like that, so in those cases, it is a bit more GUI, I would say. In Uniplex you can do part of the margin setting with the mouse, but you need to use the keyboard to specify what kind of margin you are setting, whether it is a left or a right or a tab, for example, and in Asterix I don't believe you have to do that.

Transcript at 1306-07.

----------- FOOTNOTE BEGINS ---------

[foot #] 13 "Direct manipulation" means that a pointing device such as a mouse can be used to manipulate what is on the screen and to implement certain functions, rather than typing in instructions. Transcript at 456, 1299-1300, 1426-27.

----------- FOOTNOTE ENDS -----------

Contel's Asterix is fully WYSIWYG in that what a user sees on the screen is identical to the way the document will look when it is printed out, and the document can be edited in WYSIWYG form; Uniplex is not fully WYSIWYG. Transcript at 1307-09. Grumman's expert witness on computer human interaction summarized:

Q . . . As I understand your testimony, Sir, you have agreed that Asterix is more direct manipulation than Uniplex. That Asterix is more GUI than Uniplex. That Asterix is more WYSIWYG than Uniplex, and that Asterix has a better help system than Uniplex. Is that fair?

A Yes. I have agreed with all of this. It just was a matter of degree.

Q And that is really the basis for your distinction is really, it is not that they are different, but you are disagreement [sic] is with how much they are different, is that fair?

A Yes. It is not 500 to 1000 percent difference.

Transcript at 1469.

In performing the CTTO analysis, the Air Force did not take into account the fact that there would be a new version of Uniplex and a new version of Asterix available during contract performance. Transcript at 203. Grumman's JSAN Program Director testified that although Grumman had proposed an upgraded version of Uniplex Plus 7, the Government would actually be receiving the next release of Uniplex, known as version 8 or Medley. Transcript at 897. Medley was announced in March 1992, and Grumman has knowledge of the functionality of the product. Id. at 900. The product will be available at the end of 1992. Id. at 989-90.
However, portions of the Medley product are still in beta testing. Id. at 990. It is Grumman's position that the updated version, Medley, should have been taken into consideration in the CTTO. Id. at 991.

Systems Architecture: Quantifiable Discriminators

Workstation Upgrade

The Air Force identified two quantifiable discriminators in the area of systems architecture: workstations and system administration requirements. The CTTO pointed out the following benefit associated with Contel's proposed workstations:

Contel's proposed workstations, which are RISC based and binary compatible with the corporate processors, will reduce training requirements and increase system maintainability. These workstations are significantly more powerful than the Intel 80486 based devices proposed by the other offerors and should meet the Joint Staff's needs for a longer time period. Grumman exceeded the MS-DOS performance test but failed to meet the standalone capability standard.

Protest File JSAN II, Exhibit 6 at 15.

In quantifying the workstation discriminator, the Air Force assumed that the Grumman workstations would have to be replaced at their current cost in year three of the contract, while the originally proposed Contel machines, which would then be priced far lower, would continue to be ordered. Id. at Exhibit 3. The determination as to whether proposals complied with the SPECmark requirements had been a matter of the validation phase of the evaluation; this was not considered in rating proposals because there was no standard against which the SPECmark was to be directly evaluated. Transcript at 125-26, 595.[foot #] 14

----------- FOOTNOTE BEGINS ---------

[foot #] 14 The officer involved in quantifying the workstation upgrade testified:

Q [D]id any of the workstation standards cover [SpecMarks] indirectly?
A I would say, particularly the one on the standalone capabilities, the entire point of this was to get a powerful workstation that all of the software applications could reside on. What we currently have is a system where we've got the server handling a lot of these applications, and that -- the trend in the industry and in the Government is power to the workstation. So, the more -- that's why we had in there, we wanted more on the workstation to have a more powerful workstation, rather than a more powerful server. So,

(continued...)

----------- FOOTNOTE ENDS -----------

Although it was not the subject of technical evaluation and was not expressly mentioned in the SSEB Report, the program manager testified that SPECmark performance had been a concern at the initial SSAC briefing.[foot #] 15 Transcript at 128. The Air Force's assumption that the Grumman workstations would have to be replaced in year three was based in part on a historic growth trend and a predicted increase in requirements as reflected in the CTTO analysis:

Historically, the Joint Staff's information system performance requirements increased twice from 1985 to 1990; first from Wang "green screen" dumb terminals to Wang 286 based personal computers (PCs) and second, from the 286 based PCs to Wang 386 based PCs. Expecting a similar growth trend in the future as well as additional performance requirement increases of unknown magnitude due to the introduction of a POSIX and GOSIP compliant system, a B2 multi-level secure operating system, and implementation of an advanced graphical user interface operating capability, the Joint Staff expects performance requirements to increase at least once from 1990 to 1995.

Protest File JSAN II, Exhibit 6 at 2; see Transcript at 131.

----------- FOOTNOTE BEGINS ---------

[foot #] 14 (...continued) the whole idea behind that, having more and more applications on the workstation, was to have a more and more powerful workstation to handle that load.
Transcript at 595-96.

[foot #] 15 The program manager testified that the fact that Grumman had proposed the 486 processor was in the SSEB report and that it "was a discussed item." Transcript at 129. He stated:

At the SSAC briefing in November, 1991, and I was in attendance, the technical area chief, Captain Dickinson, briefed the workstations that were offered by all the offerors, and it was a concern at that point in time that the Grumman workstation would essentially be operating in a maxed out capacity, where the RISC 6000 model would not be affected as much with the memory stress, nor with impact on speed. So it was a concern all the way back even into the SSEB evaluation presented to the SSAC in November of '91.

Id. at 128.

----------- FOOTNOTE ENDS -----------

The officer who took the lead in quantifying the workstation and system administration discriminators had been a member of the SSEB and was recently an action officer in the Information Technology Division of the Joint Staff, where he was responsible for data management and database management policy and operations. Transcript at 525, 545, 548. He has a Bachelor of Science Degree in Physics and a Master of Science Degree in Computer Science. Id. at 543. After identifying the difference in performance levels between the Grumman and Contel systems, this officer attempted to ascertain the impact the less powerful Grumman workstations would have on the Joint Staff and concluded that the Grumman workstations would have to be upgraded in year three, the year the Air Force expected the "B2 operating system to hit and a lot of unknown processing requirements would be placed on the workstations." Transcript at 551. In accordance with the RFP's schedule, he determined to keep the workstations which had been procured in years one and two and to begin acquiring new, more powerful workstations in year three.
He priced them at the year one contract price, explaining:

As for the pricing, I considered that a state of the market system that we had procured at the beginning of the contract, that a state of the market machine is going to cost generally the same thing in '95. It'll be more powerful but it'll probably cost about the same amount . . . especially since we would be most probably purchasing it noncompetitively from . . . Grumman itself.

Transcript at 551-52.

Grumman's JSAN program director testified to a variety of methods of upgrading the Grumman workstations which did not necessitate a total replacement and which could have been accomplished at lower cost. He said that Grumman's symmetrical multi-processors could have handled a second CPU board, which would have doubled the SPECmark rating; alternatively, the Air Force could have added the forthcoming Intel 586 or P5 chip or a "clock doubling" chip. Transcript at 889-90, 892-94.

After reviewing sections of the RFP which "mentioned the expandability of systems being important," i.e., Paragraphs C.3.1, C.4.5.1, C.5.7, H.42, and C.10, the officer who took the lead on the workstation upgrade testified:

Q [W]hat information was there described in a general way, about the technical requirements, delayed deliverables and expansion capability that would advise an offeror that there may be a need for a[n] upgrade at some point?

A Well, the ones we cited right here, showing that throughout the RFP we had talked about expandability, capability improvement of the systems, and the actual improvement clause itself which is, I guess, somewhat frowned upon at times by the procurement community, because it results in non-competitive procurements later on during the life of the contract, or engineering change proposals. But we saw this as necessary and were very insistent on putting it in.

Transcript at 571.

The total additional cost of the Grumman proposal due to the workstation upgrade was calculated to be $7,209,259.
Protest File JSAN II, Exhibit 6, Exhibit 3. Protester's accounting expert faulted what he termed a "fundamental underpinning" of the Air Force's conclusion regarding the workstation upgrade, i.e., its "uncritical use of prior historical trends," claiming that the "military, political and economic conditions which prevailed in the 1980's are not the same conditions that exist today," citing a "huge military buildup in the 80's." Id. at 1121-22. This expert also opined that the prices for the upgrades, which were the same as first-year prices, were overstated because they assumed a sole source situation and ignored the competitive marketplace and the downward trend in prices in the microcomputer industry. Id. at 1126-27.

Respondent's expert prepared an alternate calculation of Grumman's productivity loss due to its processors, assuming that the Air Force would use the initial workstation throughout the duration of the contract. Transcript at 1882-88, 1961-64; Intervenor's Exhibit 11 at 2. Adopting the analysis of intervenor's expert that there would be a ten percent productivity loss due to the slower processor speed of the Grumman equipment, respondent's expert determined this would be a thirty-minute loss of time per day on the Grumman system. Id. at 1994. He therefore multiplied this ten percent of the five-hour-per-day usage time by the hourly rate for each day of each month of the contract. Id. This did not take into consideration a workstation upgrade but rather assumed that the slower workstations would be used for the life of the contract. Id. at 1888. Depending on the hourly rate and number of users, this analysis yielded an increased dollar value to the Contel system ranging from $14,470,457 to $35,748,400. Intervenor's Exhibit 11 at 2; see Transcript at 1994-95, 2028-30. Of course, if one assumed that there would in fact be a workstation upgrade, these numbers would be reduced. Protester's Exhibit 45; Transcript at 1956-70, 2031.
Respondent's expert acknowledged that if one assumed the upgrade occurred, it would be fair not to impute any processor speed advantage to Contel after the date of the upgrade. Id. at 1956.

System Administration Time

Both Grumman and Contel had failed the standard for system administration time, which was one to two hours per week for a twenty-user community of interest. Contel had proposed five to eight hours per week. The Air Force in its original evaluation had interpreted the Grumman proposal as offering a twenty-manhour per week system administration time. However, during the hearing in JSAN I, Grumman personnel testified that Grumman had intended the twenty-hour per week number to refer to system time and not manhours. The Board found:

Grumman's engineering manager for JSAN identified three functions of system administration requiring human intervention and assigned a total number of manhours per week to perform each such function for a twenty user COI as follows: optimizing file space - 2.3 hours, software updates - 4 hours, security administration (unrelated to reviewing security related reports) - 3.5 hours. Transcript at 719-26. This would indicate that Grumman's manhours per week for system administration should have been 9.8 hours.

JSAN I, GSBCA 11635-P, slip op. at 23-24 (Mar. 19, 1992). The Board evaluated this testimony in a footnote:

This witness' testimony on this point, however, was inconsistent and confused. Transcript at 719-26, 752-75. Apparently, at his deposition he had testified that the manhours per week should have been 6.1; at the hearing he said the 6.1 meant machine hours, then said that this too was an error. Id. at 753-60. What the witness did seem certain of was that the 20 hours "should have been machine hours for a 178-user community." Id. at 756.

JSAN I, GSBCA 11635-P, slip op. at 24 n.9 (Mar. 19, 1992).
In the CTTO, the Air Force did not change its original conclusion that Grumman had proposed a twenty-hour per week system administration time. Based on its assumption that the Grumman system would take twelve to fifteen hours more than Contel's to administer each twenty-user workstation group, and using a cost of $29 per hour, the Air Force estimated the following increased costs associated with Grumman's proposal:

Additional Cost of Grumman's Proposal
(Calculation: Number of systems/20 X (12 or 15 hrs/wk) X 52 wks X $29)

                                 12 hrs/week    15 hrs/week
Contract Year 1    213 Systems    $  193K        $  241K
Contract Year 2    521            $  471K        $  589K
Contract Year 3   1000            $  905K        $ 1131K
Contract Year 4   1493            $ 1351K        $ 1689K
Contract Year 5   1630            $ 1475K        $ 1844K
Contract Year 6   1741            $ 1575K        $ 1969K
Contract Year 7   1811            $ 1639K        $ 2048K
Contract Year 8   1851            $ 1675K        $ 2093K
Total                             $9,284K        $11,604K

Contel's lesser systems administration requirement is an estimated $9.3 to $11.6 million value to the Government.

Protest File JSAN II, Exhibit 6 at 15.

The officer who quantified this discriminator reviewed the descriptions of how the system administration tasks would be performed in the two proposals. Transcript at 530. He concluded that as each workstation is added to the system, system administration time would increase linearly, finding this consistent with Grumman's proposal on the administration time required to accomplish weekly full disk backups. Transcript at 587-89. Grumman's JSAN program director, however, testified that Grumman's system administration was highly automated, such that adding twenty workstations would not require twenty additional manhours because many tasks occur simultaneously with little to no additional human intervention. Transcript at 929-30. Grumman's computer expert opined that the Air Force's linear extrapolation was "completely inappropriate." Transcript at 1383-86.
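The Air Force's linear extrapolation in the table above can be checked directly. The sketch below simply applies the formula stated in the CTTO (number of systems/20 x hours per week x 52 weeks x $29) to the yearly system counts taken from the table; no inputs beyond those in the record are assumed.

```python
RATE = 29    # hourly cost ($)
WEEKS = 52
SYSTEMS = [213, 521, 1000, 1493, 1630, 1741, 1811, 1851]  # contract years 1-8

def yearly_cost(n_systems, extra_hours_per_week):
    # One twenty-user workstation group per 20 systems, each assumed to need
    # `extra_hours_per_week` more administration on the Grumman system.
    return n_systems / 20 * extra_hours_per_week * WEEKS * RATE

low = sum(yearly_cost(n, 12) for n in SYSTEMS)    # roughly $9.28 million
high = sum(yearly_cost(n, 15) for n in SYSTEMS)   # roughly $11.60 million
print(round(low), round(high))
```

This reproduces the first-year figures of $193K and $241K and the $9.3 to $11.6 million range; the table's $9,284K total differs slightly from the directly computed $9,283K because the CTTO summed per-year figures already rounded to the nearest thousand.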
In its written cost-technical tradeoff analysis, the Air Force acknowledged that Grumman claimed that the twenty hours was not manhours but system time, stating in a footnote:

However, this was not stated anywhere in Grumman's proposal. The GSBCA described Grumman's testimony on the subject of how many manhours would actually be required as "confusing." While the actual cost to the government might differ, cost estimates in this analysis rely upon the data contained in Grumman's and Contel's proposals.

Protest File JSAN II, Exhibit 6 at 16.

Nonquantifiable Discriminators in System Architecture

The Air Force identified three nonquantifiable discriminators under the item system architecture: (1) minimizing points of failure, (2) throughput and corporate processor, and (3) architecture manageability.

Minimizing Points of Failure

In describing the difference between the proposals on points of failure, the CTTO analysis stated:

Grumman's proposed system architecture contained several single points of failure which could isolate large numbers of users (entire 300-person divisions in some cases) from access to central data stores and other Joint Staff users. Contel's proposed system is designed to provide redundant paths to central data stores and to limit the number of users affected by single points of failure (worst case loss of 192 members). Other single points of failure in Grumman's proposed system were the solitary JIMS-to-JSAN connection and the XTS-200 secure guard after B2 security conversion, the failure of which could isolate 300-800 Joint Staff users from the network. Contel proposed redundant solutions to avert system failure points. The cost impact for this element could not be quantified.

Protest File JSAN II, Exhibit 6 at 3.
The item chief for system architecture elaborated:

[The discriminator minimizing single points of failure] also considered redundant connections, specifically the JIMS to JSAN connection, which would be critical during the transition time period. Grumman was only providing one link, which if it failed would separate the JIMS community from the JSAN community. Another factor which as far as points of failure, which was identified to me by the security people, was the use of the XTS-200 security device which they were going to leave in the network after really it wasn't needed. Once B-2 was installed on the network, that device would no longer be necessary, yet they were leaving it in the network and it was a potential single point of failure that would isolate the SCI community on the network.

Transcript at 714-15.

Corporate Processor Throughput

The Air Force's program manager described a concern with Grumman's proposal in the corporate processor/throughput area:

The solution provided by Grumman for the corporate processor identified that there would be additional uses of that particular platform for access of the database system, office automation system, as well as access to corporate storage capabilities which we identified as a potential for a bottleneck, which is related to a throughput.

Transcript at 216.

The item chief for system architecture specified:

Q And could you explain for the Board what [your] concern [about Grumman's corporate or communications processor] was?

A The concern was that the processor, as configured, I believe it was an AT&T 7040 processor was capable of being configured with four CPUs. Grumman had chosen to configure it with only two CPUs, and a larger concern was that the communications processor was performing more functions than just communications processing. It also had some data base functions, and I believe it had some office automation functions running on it as well.

Transcript at 676-77.
When asked whether the addition of two more CPUs would solve this throughput problem, he testified:

It would eliminate part of the concern, but another part of that concern was that the communications processor was configured to run more applications than just communications processing. The corporate storage was connected to the communications processor, and there were also applications running on the communications processor outside of communications processing. Some database functions were on it and I believe some office automation functions were also on that processor.

Id. at 714.

Architecture Manageability

The Air Force's program manager defined architecture manageability as:

That's the amount of program manager, program office effort that would be involved in managing a multi-hardware, multi-software system. The types of responsibilities that fall under life cycle management, for example, systems upgrades, configuration management, security systems administration, security systems accreditation and general management of one system over the other, especially taking into consideration that Grumman provided a high risk solution to satisfy the JSAN requirements. It also takes into consideration, training that system administrators would have to go through to learn the multiple hardware and software systems . . . .

Transcript at 217.

The Air Force's concern about Grumman's architecture manageability stemmed from Grumman's multiple systems; the CTTO analysis states:

Multiple systems necessitate additional training for Action Offices and system administrators to operate the multiple systems. Risk is further increased when installing upgrades because the upgrades must operate with existing systems in order to properly exchange data with other on-line systems hardware and software applications.

Protest File JSAN II, Exhibit 6 at 3.
In addition, under the Security Item, the CTTO continued:

Counter-balancing Contel's schedule risk are the technical simplicity and straightforwardness of their solution which employs a single product family line as opposed to Grumman's multiple product family lines. This single product versus multiple product line consideration is above and beyond that previously discussed under System Architecture. Once implemented, Contel's solution would provide greater reliability, reduced systems administration burden, and less of a potential information flow bottleneck.

Id. at 4.

The Quantifiable Discriminator in the Transition Factor: User Support Team

The CTTO analysis included a factor which had not been expressly stated in the solicitation with the following explanation:

A factor not specifically addressed under the Transition Item, but critical to the success of the JIMS to JSAN transition, were the User Support Teams proposed by Grumman and Contel. Both Offerors proposed teams of skilled technicians to provide technical and operational support to the Joint Staff. However, the User Support Team proposed by Contel is more extensive than that proposed by Grumman. This larger team will ensure better, more timely service to Joint Staff personnel, allowing for a smoother transition with less disruption to the Joint Staff mission and should result in an estimated savings of $5.8 million.

The Contel proposed User Support Team provides technical support savings for the Joint Staff. Currently, the Joint Staff provides technical support personnel to each Community of Interest using either a military assignee or contracted-out services. Because of expanded technical assistance workload, the Joint Staff supplements the military with twelve (12) contract support personnel. Given the additional available resources of skilled personnel offered by Contel and available for technical duties, the Joint Staff could eliminate 12 contracted-out positions, thus reducing these costs.
Currently, the Joint Staff pays an average of $60,000 (in 1992 dollars) per year per person for these services. By using Contel's systems personnel to support JSAN, the Joint Staff could save as much as $5.8 million in contracted-out service charges (12 people X 8 year contract X $60,000/year = $5,760,000).

Protest File JSAN II, Exhibit 6 at 22-23.

Contel included an extensive discussion of its user support team in its proposal. Contel's price for one month of its user support team was $115,141, which the Air Force interpreted to mean approximately sixteen-twenty people. Transcript at 271-72. Grumman's price for one month of user support was $6,700, which the Air Force program manager interpreted to mean a small user support team, approximately a two-person help desk. Transcript at 161-62, 318, 948. Grumman's price would barely cover two data entry clerks. Transcript at 324. Grumman's program manager testified that Grumman had only proposed a price for one "man month" of user support as opposed to a team month -- even though the solicitation on its face solicits a price for a team. Transcript at 877-79, 948-49. Neither offer specified the number of persons who would be supplied under the user support CLIN. Grumman's proposal stated: "In addition to the help desk and maintenance function, the user support team provides system administration services such as backups, restorations, and archiving the Government-designated corporate JSAN components. . . ." Protest File JSAN II, Exhibit 35, Book II at I-H-40; Transcript at 161. Grumman's description of its user support in its proposal was not as extensive as Contel's. Id.

Transition: Nonquantifiable Discriminators

The CTTO analysis also determined nonquantifiable benefits to the Contel proposal in two other areas of transition: minimum user impact and conversion of JIMS files at transition. Protest File JSAN II, Exhibit 6, Exhibit 4.
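The CTTO's user support savings figure quoted above is straightforward arithmetic; as a check:

```python
# The CTTO's user-support savings estimate: 12 contracted-out positions
# eliminated, at an average of $60,000 (1992 dollars) per person per year,
# over the 8-year contract term.
positions = 12
annual_cost = 60_000
contract_years = 8
savings = positions * annual_cost * contract_years
print(f"${savings:,}")  # $5,760,000, the CTTO's "as much as $5.8 million"
```

The estimate assumes all twelve positions are eliminated for the full contract term and applies no discounting, which is why the CTTO presents it as an upper bound ("could save as much as").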
The CTTO analysis also recognized benefits of the Grumman proposal in security, maintenance, and management. Protest File JSAN II, Exhibit 6 at 2. The SSA's decision provided:

My analysis of the information presented to me also included the strengths of Grumman's proposal, which were in the three least important technical areas. Grumman's B2 security solution has a somewhat higher probability of timely delivery than that proposed by Contel. Grumman's method of converting database applications on the database processor is less disruptive than that proposed by Contel. Greater experience with Wang systems and with large LAN and security automated information systems demonstrated by Grumman earned the proposal a higher rating in the areas of maintenance and management.

However, some of these strengths are partially offset by the realities of JSAN's implementation schedule. Specifically, for security, Grumman, like Contel, does not yet have a B2 rated system and has only certified that B2 level will be delivered on schedule. Moreover, Grumman offered multiple hardware and software products which increases implementation risk. For maintenance and management, Grumman's advantage results from its choice of Wang as a subcontractor. This advantage diminishes as JSAN systems are installed and as the current Wang systems are replaced. The Wang advantage ceases in JSAN year 4 or 5 when JSAN installations are completed. This is reflected in the proposal evaluations. While these Grumman strengths are appealing, they do not outweigh the value represented by the greater overall technical strength of the Contel proposal.

Id.

The SSA's Decision

On June 29, 1992, the SSA received a two and one-half hour briefing at which the Air Force's CTTO analysis was described. Transcript at 826-27. The SSA analyzed every aspect of the CTTO report, satisfying himself that the nonquantifiable and quantifiable elements had been fully considered before signing the decision on July 13, 1992.
Protest File JSAN II, Exhibit 6; Transcript at 828. The SSA's decision provided: "the nonquantifiable discriminators identified in the [tradeoff] analysis clearly establish Contel's as the superior proposal." Protest File JSAN II, Exhibit 6 at 1; Transcript at 805. The SSA testified that the nonquantified discriminators standing alone identify Contel as not just the superior proposal, but also the best value to the Government. Transcript at 805, 809, 832. In pertinent part, the SSA elaborated:

The non-quantifiables turned out to be of paramount importance because they were so directly focused on the user-friendliness and system architecture areas.

. . . .

If you don't have that, the easy training, the user-friendliness, the long term benefits of that, because of the degree of dependency upon that system, which is almost total in many cases. . . . you're not going to get a job done within time constraints, perhaps, or with the completeness you otherwise might.

. . . .

In the system architecture area we're talking about single points of failure. . . . We're talking about the workstation power. . . . We're really doing a productivity analysis here. If you want to boil this whole thing down . . . that's why I stressed the non-quantifiables. . . .

Transcript at 832-33.

The SSA was aware of Grumman's overall High Risk rating. Protest File JSAN II, Exhibit 6; Transcript at 839-40. He testified: "Cost risk, again, came to play most importantly in the top two priority items, user friendliness and system architecture." Transcript at 842. In his view, when the risk factor is considered, this effectively reduces the presumed $33.9 million difference in prices to a "paper difference." That is, when risk is added into the equation, "that number starts moving," such that the actual or real difference might not be anywhere near the facial difference of $33.9 million. Transcript at 842-43.
The SSA testified that he was aware that discounting the quantified discriminator would have left a dollar delta in Grumman's favor. Transcript at 845. He explained:

Q Taking the worst case number that drops out of either one of those analyses -- i.e., $15 to $16 million dollars -- if you [had] known that the quantified discriminators didn't eliminate all of the $33 million dollar difference favoring Grumman, and knowing now that there would be, under this particular analysis, a $15 million dollar unaccounted for delta, if you will, would that fact have changed your decision and, if not, why not, or if so, why?

A It would not have changed my decision. I can certainly answer that. Again, we go back to the impact of the non-quantifiables. Let me just stop there. No matter how you cut this pie, and what assumptions you use, I have looked at the worst case, and the most favorable case toward Contel's, and regardless the answer still comes up award to Contel because of the tremendous impact of the non-quantifiables.

Id. at 846.

The SSA summarized:

I was more than satisfied that it did in fact validate our initial decision, and when you marry the results of the cost/technical tradeoff with our original assessment of the clearly superior proposal by Contel, and associate that with the high-risk assessment of the Grumman proposal, which is a serious consideration, obviously, for the life of the contract, I was convinced at that time that eight years from now, eight years after award and look back, I think we will find that not only is the Contel solution technically superior, but it was the most cost-effective solution.

Transcript at 839-40.
In explaining why in his view the nonquantifiables constituted value, the SSA testified:

[T]hat's where I really took a good long look at the term of the contract, the risk involved, the -- almost what I would see as an immediate requirement to make substantive changes if the Grumman system was installed to make it more user friendly right off the bat. I really saw this as a situation where a clearly superior technical proposal with a moderate risk assessment was up against a very low cost and highly risk tasked counterproposal, and over the term of the contract, productivity measurements, impact on the day-to-day operation of that system, downstream upgrade costs and the tremendous training tag associated with the non-user friendly system, adding all those together . . . it came to a very clear decision in favor of Contel.

Id. at 865-66.

The Expert Opinions

Following the filing of this protest, experts hired by protester, respondent, and intervenor analyzed the CTTO decision. Intervenor's expert witness in system design (including software design, user interface design, and systems architecture) testified that, due to the difference in response times, it was his judgment that users of the Contel JSAN system would be ten percent more productive than users of the Grumman JSAN system. Transcript at 1565-66. Based on his observation of the architectures of the systems, his personal experience working with the machines, and personal knowledge of the systems, intervenor's expert witness estimated that the Contel system would yield an average response time of about half a second or less and the average response time on Grumman's system would be around a second or higher. Transcript at 1565-66.[foot #] 16 Intervenor's expert further estimated that users of the Contel system would be five percent more productive than users of the Grumman system due to differences in the user interfaces (direct manipulation versus graphical user interface) offered by the systems.
Transcript at 1547-51; 1595-97. In forming his opinion, intervenor's expert relied on his observation of Asterix versus Uniplex, the technical manuals of each system, the MOTIF compliance of each system, and two articles which addressed the quantification of different kinds of user interfaces using an empirical methodology. Transcript at 1522-24; 1533-34; 1537-38.[foot #] 17

----------- FOOTNOTE BEGINS ---------
[foot #] 16 According to intervenor's expert witness, response time is the time elapsed from the moment the user, for example, strikes a key to seeing the completion of the function or operation on the screen. Transcript at 1654.
[foot #] 17 The first article, entitled "Task Orientation and User-Oriented Dialog Design," compared direct manipulation ("mouse") to menu-driven interface. Intervenor's Exhibit 6. The (continued...)
----------- FOOTNOTE ENDS -----------

Intervenor's expert witness admitted that the findings in the two articles could not be directly translated to JSAN because of the complexity of that environment. Transcript at 1541. However, intervenor's expert stated that in his judgment, based on the two studies:

when we try to assess the productivity difference that may come about in the environment that JSAN has that we would have to reduce the numbers fairly substantially to come up with a fairly conservative estimate, and my estimate is that if I was to take a number like five percent difference that that will account for a whole lot of differences that may be there between these studies and the JSAN environment.

Transcript at 1547.

Intervenor's expert found that the results of the LTD were consistent with his expectations regarding system response times. Transcript at 1564. Intervenor's expert witness conveyed his opinion regarding the five and ten percent productivity variable between the two systems to respondent's expert. Transcript at 1567.
Calculations Using Productivity Percentages

Respondent's expert also testified regarding the effects intervenor's expert's productivity analyses had on the Government's estimates (with and without MPC and PV discounts) and prepared four alternative calculation methods. Transcript at 1879-87. Based on the five and ten percent productivity differentials offered by intervenor's expert and assuming that the Grumman system would not be upgraded, respondent's expert witness calculated, in dollar amounts, the cost attributable to lost productivity associated with the Grumman system for the entire contract period. Intervenor's Exhibit 11 at 2; Transcript at 1887-88, 1991-94. Respondent's expert prepared a model demonstrating four alternative methods of calculating the quantified discriminators. The alternatives varied, in some cases, by using different numbers of annual work-hours to figure hourly rates and by inputting different numbers of users into the calculations. Transcript at 1869-78, 2034-36; Intervenor's Exhibit 11 at 1.

----------- FOOTNOTE BEGINS ---------
[foot #] 17 (...continued) second article, entitled "The Benefits of the Graphical User Interface," was a study to identify differences in the performance of graphical and character-based user interfaces for applications software. Intervenor's Exhibit 5.
----------- FOOTNOTE ENDS -----------

Discussion

Did the Air Force Improperly Give "Evaluation Credit" in Areas Where Offerors Failed to Meet or Exceed Requirements?

Grumman contends that the CTTO was fatally flawed because the Air Force considered the comparative advantages of offers in areas where neither offeror had received evaluation credit during the technical evaluation/rating phase of the procurement.
Grumman contends that this methodology contravened the following express language of Section M.3.3.1 of the RFP:

Offered components, features and capabilities beyond the minimum mandatory requirements specified in the RFP, Section C references below[,] will be evaluated depending upon the degree to which they benefit the Government. To receive evaluation credit the offered additional component, feature and/or capability beyond the minimum requirements must substantially benefit the Government as determined by its contribution to the specific items, factors and considerations identified below.

There followed the list of the six technical items within which were factors and considerations. There is no dispute that the Air Force in the CTTO did not strictly limit the comparison of offers to items, factors, and considerations which were given "evaluation credit" during the technical evaluation. Rather, areas where both offerors had failed a standard and received no evaluation credit during the initial technical evaluation were weighed and assigned value in the CTTO. Further, areas which had not been the subject of evaluation against the standards, but simply encompassed minimum mandatory requirements, like SPECmark performance, were also weighed in the CTTO.

Section M.3.3.1 read in isolation is open to the interpretation Grumman espouses, i.e., that this section precluded the Air Force during the CTTO from giving evaluation credit to any item, feature, or capability which had not received credit during the initial technical evaluation. The problem with Grumman's argument is that Section M.2, "Basis for Award," informed offerors that selection would not only be based upon how well the offers satisfied the technical evaluation criteria set forth in Section M.3 but "also on an integrated assessment of the proposals submitted in response to this RFP." Protest File, Exhibit 54 at 318 (emphasis added).
We deem this Section's latter language to permit a head-to-head comparison of proposals in an integrated fashion without regard to whether the proposal had been given evaluation credit for a particular feature. To interpret the RFP in the fashion Grumman suggests would ignore this clear language of Section M.2.[foot #] 18 We recognize that the agency afforded itself wide discretion in the selection provision, especially when coupled with the fact that no weights were assigned to the various technical items and vendors were not told -- and the agency itself never pinned down -- how much more important technical was than cost. We note, however, that the time to protest provisions of a solicitation is before the due date for proposals. We, thus, deny this ground of protest. DALFI Inc., GSBCA 8975-P, 87-3 BCA 20,018, at 101,356, 1987 BPD 131, at 2, recon. denied, 87-3 BCA 20,070, 1987 BPD 141.

Was the Air Force's Cost-Technical Tradeoff Analysis Reasonable?

In JSAN I we sustained the protest because the record contained no explanation of why the superior technical proposal of Contel justified paying a $33.9 million premium for that system. We thus directed the Air Force to perform such an analysis and articulate and document its decision. Our task in reviewing this analysis is not to substitute our judgment or the judgment of expert witnesses for that of the agency officials. The legal standard against which we review best value determinations is not one of perfection or even accuracy, but rather one of reasonableness. Lockheed Missiles & Space Co. and International Business Machines v. Department of the Treasury and AT&T Federal Systems, GSBCA 11776-P, 11777-P (TMAC II), 1992 BPD 155, at 20. Indeed, we have upheld a best value determination even where "some elements of the best value analysis lack a rational basis." Computer Sciences Corp., GSBCA 11497-P, 92-1 BCA 24,703, at 123,297, 1992 BPD 6, at 32.
There is no formulaic methodology for conducting a best value determination; what matters is that award is consistent with the terms of the solicitation and that any price premium is justified by specific technical enhancements. TMAC II, 1992 BPD 155, at 20. We have held that agency officials possess considerable discretion in conducting a cost-technical tradeoff analysis. Id. The parameters of that discretion may be limited, however, by the terms of the solicitation as well as by statute and regulation. ProServe Corp., B-247948 (Oct. 5, 1992), slip op. at 4 ("[W]here award is to be based on the best value to the government, as here, a cost/technical tradeoff may be made in selecting an awardee subject only to the test of rationality and consistency with the established evaluation factors."). Here, the solicitation provided very little in the way of limitation on that discretion. With these principles in mind, we turn to the CTTO analysis.

----------- FOOTNOTE BEGINS ---------
[foot #] 18 Moreover, although Grumman has vigorously promoted this interpretation of the solicitation in litigation, it does not appear that Grumman had this interpretation in mind when preparing its offer. See Transcript at 969-80.
----------- FOOTNOTE ENDS -----------

Were the Nonquantified Discriminators Reasonable?

The Air Force identified four nonquantifiable discriminators in the area of user friendliness: ease of system use/learning, consistent and logical menu layout, system intuitiveness, and GUI/WYSIWYG. Grumman does not challenge the superior rating Contel received in these areas. Instead, Grumman would minimize any enhancements in the Contel system, claiming, for example, that system intuitiveness is less of an issue with commonly used functions and trained users.
Even accepting these claims, we find that the distinction between the proposals in "ease of system use" and "intuitiveness" was significant in light of the intended use of the systems by action officers within the Joint Staff, who would constantly be interrupted and be expected to do unfamiliar tasks with some frequency. Similarly, given that the system was to be used by senior officers, the enhanced "ease of learning" was also a significant benefit to Contel's proposal. In reaching this conclusion, we note that the RFP expressly incorporated satisfying "the Air Force's needs" into the overall best value determination. Protest File, Exhibit 54 at 318. While Grumman correctly asserts that both offerors proposed graphical user interfaces and the "what you see is what you get" feature to some extent, there was extensive testimony on both the enhanced benefits of Contel's GUI and its full WYSIWYG and the problems with Grumman's GUI and partial WYSIWYG. Even protester's expert recognized that Contel's system offered enhanced GUI, WYSIWYG and a better help system; his only quarrel was with the extent of the benefit. Grumman also contends that the Air Force's analysis regarding user friendliness was unreasonable in that it failed to take into account the fact that Uniplex was being upgraded. Grumman argued: "The Board has held that components of an offeror's solution that are foreseeably available should be considered in the technical evaluation of a proposal," citing Hughes Advanced Systems, GSBCA 9601-P, 89-1 BCA 21,276, at 107,311, 1988 BPD 253, at 20-21. Protester's Posthearing Brief at 60 n.24. This argument evinces a misunderstanding of the Board's ruling in JSAN I. The Board did not direct the Air Force to reopen the record and evaluate the offers anew, but rather to assess whether the technical enhancements of the Contel solution justified the price premium. JSAN I, GSBCA 11635-P, 1992 BPD 100, at 47. 
The Uniplex upgrade had not been announced at the time the Air Force performed its initial technical evaluation or at the time Grumman submitted its BAFO. It would have been infeasible for the Air Force to take into account every updated software product in the context of conducting the CTTO, and would not have served the goals of efficient and economic procurement. Nor has Grumman demonstrated that the Air Force's nonquantified discriminators in the system architecture area were unreasonable. Grumman claims that the discriminator "architecture manageability" is unfair and contrary to the RFP's requirements of GOSIP and POSIX compliance and the policy favoring the procurement of open systems. Protester's Posthearing Brief at 73. We disagree. The Air Force's designation of "architecture manageability" as a discriminator stemmed from a legitimate concern that Grumman's multiple hardware platform would be more difficult to learn, administer, and maintain, particularly in the B2 environment. This was reflected in the original SSEB security evaluation as well as in the system architecture evaluation. Protest File, Exhibit 2823. Grumman also challenged the benefit the Air Force attributed to Contel's processor. The SSEB Report pointed to Contel's binary compatibility among all workstations and corporate processors as a strength in systems architecture. Protest File, Exhibit 2823 at 108. In contrast, the Air Force's system architecture item chief explained that he had a serious question about whether Grumman's communications processor as configured could handle the workload that would be put on it in Grumman's design. Transcript at 714. Grumman countered by arguing that the database was not even on the communications processor. Grumman further stated that the cost of upgrading its processor was readily ascertainable and that it could be accomplished at minimal cost, $221,000, and should not have been a significant discriminator. 
Even if we accept Grumman's arguments, however, they do not eliminate the Air Force's legitimate concern about its processor. In sum, the technical conclusions that Grumman had a more heterogeneous system that would require more maintenance and training and that its corporate processor was potentially less capable of handling various tasks are supported by the original evaluations of the systems as well as by the record as a whole.

The Quantifiable Discriminators

In challenging the Air Force's quantifiable discriminators, Grumman has pointed out numerous imperfections with the analysis the Air Force performed. We reiterate, however, that our function in reviewing these best value determinations is not to substitute our judgment or the judgment of expert witnesses for the conclusions of agency officials unless those conclusions are demonstrated to be unreasonable or contrary to the solicitation or law, regulation, or a DPA.

Quantified Discriminators in User Friendliness

Formal Training

Although protester's computer expert raised a valid question about the unusually large disparity in the training times between the Contel and Grumman systems, we uphold the Air Force's estimates, giving more credence to the estimates which were based on observation and experience with the Joint Staff's training on comparable systems. Contrary to Grumman's argument, the Air Force estimator did not pick his training estimates out of "thin air"; rather, the times he selected were based upon his experience with comparable systems in the very environment for which these systems were being procured. We find that these training estimates were reasonably based on the Air Force's observation of users trained on its JIMS system, which is similar to Grumman's proposed system, and on its standalone workstations, which are similar to Contel's proposed system.
The Air Force estimator was an educator, possessed considerable technical expertise in the computer field, and had himself trained users and provided technical assistance to JIMS users. He was thus in a good position to make these judgments. Nor do we agree with Grumman and its computer expert that the Air Force should have adopted the duration of training classes set forth in the offers. Because the RFP required the courses to be of approximate durations, offerors would have taken a chance if they deviated from those estimates.[foot #] 19 In fact, neither Grumman nor Contel did; both offered the Government the duration it had approximated in the RFP. Although there may well have been more accurate ways to estimate the time for formal training, we find the methodology used by the Air Force to have been reasonable.

Much time at the hearing was spent on the $29 per hour rate for users at the Joint Staff. Grumman, through its accounting expert, claims that the time should not have been valued at all, and the Air Force says the rate should have been higher. The basis of Grumman's expert's opinion was that it was an improper accounting practice for the Air Force to have assigned any hourly rate to the time of these employees because they were not compensated for overtime, and there is no cost to the Government for the increased training time.

----------- FOOTNOTE BEGINS ---------
[foot #] 19 CLIN 7010, for example, stated: "This course shall be approximately three days in duration." Protest File, Exhibit 54 at 135 (emphasis added). The other course CLINs contained identical language regarding the course duration.
----------- FOOTNOTE ENDS -----------

The Air Force may have violated some technical accounting principles in doing this aspect of the cost-technical tradeoff. But performing a CTTO is not an exercise (like valuing the assets of a corporation) which requires strict adherence to proper accounting principles.
Assigning a dollar value to the time of Joint Staff personnel makes good sense in comparatively evaluating proposals. We thus agree with respondent that an hourly rate should have been applied. We further conclude that the $29 per hour rate was far too low and should have included fringes. Without calculating the precise dollar value of this, we are persuaded that the Contel proposal would result in a significant cost savings to the Air Force in the area of formal training.

On-the-Job Training

We are also persuaded that the Air Force's time estimates for on-the-job training are reasonable, in that they bear a rational relationship to getting-up-to-speed time for users on similar systems. Transcript at 379-83, 385-86. While the seventy-five percent productivity factor applied over the entire period of getting up to speed may not be mathematically correct, protester has not persuaded us that it is unreasonable. We agree with the notion that there is some time during which users are not operating at their peak proficiency and conclude it was proper to consider this as an element of the tradeoff even if the productivity factor used may not be precisely accurate or mathematically sound.

Quantifiable Discriminators in System Administration

Workstation Upgrade

Grumman contends that the workstation discriminator is contrary to the RFP and evaluation criteria. Protester's Posthearing Brief at 62-65. Grumman's claim is that the RFP's minimum mandatory SPECmarks performance requirements, six for the basic workstation and ten for the advanced, were met by Grumman and that there were no evaluation standards dealing with more powerful workstations, making the more powerful workstations "goldplated." We disagree. The evaluation standards did encompass more powerful workstations. As the Air Force evaluator testified, both the requirement for standalone capabilities and system administration were related to the power of the workstation.
The RFP also informed offerors of the need to propose expandable systems in several places, including requirements for flexible processing architecture to allow for future growth, in Paragraphs C.3.1, Advanced JSAN Capabilities and Growth; C.4.5, Technology Improvement; C.5.7; and equipment CLIN C.10, which sought expansion capabilities. Protest File, Exhibit 54 at 46, 48, 50, 63.

Grumman also challenges the Air Force's methodology for quantifying the workstation discriminator, claiming the upgrade is both unrealistic and overly expensive. We cannot agree. The assumption that the workstations would not be able to handle the workload in year three, when the B2 requirement would come into play with all the other demands being put on these systems, was reasonable. Nor can we say that pricing the systems at their first-year cost was unreasonable. We simply do not agree with protester's accounting expert that in year three of an ongoing contract these systems could have been procured competitively, especially given the integration and security requirements. Nor are we persuaded by Grumman's program manager's testimony that the Air Force was required to have considered increasing the processing speed of Grumman's workstations by adding a second CPU board or adding the forthcoming INTEL 586 chip, the P5, or a clock-doubling chip. Even if the Air Force's determination that a workstation upgrade was needed in year three was not the best way of quantifying the increased power of the Contel workstations,[foot #] 20 it was not unreasonable.

System Administration

In JSAN I, Grumman had claimed the Air Force misinterpreted its proposal in the area of system administration time by concluding Grumman had proposed twenty hours per week of human time to administer the system, when in fact Grumman intended the number to be twenty hours of system time. JSAN I, GSBCA 11635-P, 1992 BPD 100 at 46-47.
Grumman had further argued that the Air Force should have sought clarification of this number. We disagreed, stating:

Clarification as defined by Federal Acquisition Regulation 15.601 "means communication with an offeror for the sole purpose of eliminating minor irregularities, informalities or apparent clerical mistakes in the proposal." 48 CFR 15.601 (1991). Grumman's failure to specify that the "20 hours" meant system time did not fall into any of those three categories. Nor should it have been caught as an obvious mistake, given that the RFP was reasonably interpreted as requiring manhours and that Grumman failed to say what the human time to administer the system per week was anywhere else in its proposal.

Id. at 46. We implicitly concluded that the Air Force's evaluation of Grumman's proposal stood. As noted above, in directing the Air Force to perform a CTTO analysis, we did not direct that the technical evaluation be revised in any way -- we simply required an examination of whether the technical enhancements of Contel's proposal justified the higher price. Thus, the Air Force was technically correct in considering its original evaluation of system administration time in the CTTO. The bottom line is that the Air Force acted reasonably in concluding that Grumman's system administration time was greater than Contel's.

----------- FOOTNOTE BEGINS ---------
[foot #] 20 Indeed, respondent's expert would attribute more of a quantified benefit to Contel assuming the Grumman workstations were kept through the life of the contract.
----------- FOOTNOTE ENDS -----------

User Support

The Air Force determined that Contel's more extensive user support team would allow it to eliminate twelve contracted-out personnel currently employed by the Joint Staff, for a total savings to the Government of $5.8 million.
This conclusion was based upon Contel's more detailed and expensive proposal for user support, and the Air Force's conclusion that Contel was offering a sixteen- to twenty-person user support team while Grumman was offering a two-person "help desk." Grumman claims that it was proposing a man month of user support instead of a team month and thus that its proposal was misevaluated by the Air Force. We reject that contention. If Grumman proposed a man month, it was the one that misread the RFP. The RFP's description of the user support referenced a team, and the Section B pricing table expressly solicited the price for a team.

Grumman claims the Air Force's conclusion that it offered a two-person help desk was unreasonable since no offeror proposed a number of persons but only a service at the price offered. We find nothing unreasonable in the Air Force's derivation of the number of persons on an offeror's user support team based upon its price and overall proposal. Nor do we have any problem with the Air Force taking this consideration into account in the CTTO. Finally, the cost savings attributed to the discontinuation of the twelve contract personnel currently employed by the Joint Staff was reasonable.

The Failure to Apply the Discounts to the Quantifiable Discriminators

Grumman is unquestionably correct in its assertion that the Air Force erred in failing to apply the MPC and PV discounts to the quantified discriminators. At first blush this appears to be an egregious error, and Grumman argues that when the discounts are applied, the Grumman proposal becomes the best value since the estimated quantified benefit of Contel's offer is reduced from $42.25 million to $18.72 million. Protester's Exhibit 28, Schedule 2. The question then becomes whether this reduction in the quantified discriminators is sufficient to invalidate the entire CTTO analysis. We conclude that it is not.
In looking at the totality of the CTTO analysis, we conclude that there is considerable additional value which may not have been properly quantified in the CTTO analysis but is nonetheless supported by the record: the increased training time for Grumman's system, the increased productivity associated with the more powerful Contel processors, Contel's more extensive user support team, and Contel's lower system administration time. These factors convince us that there is significant value represented by the quantifiable discriminators, which, when taken with the nonquantified discriminators, demonstrates the superiority of the Contel proposal and the risks of the Grumman proposal, especially in the paramount area of user friendliness critical to the Joint Staff's needs. We therefore conclude that the Air Force's CTTO analysis, flawed as it may have been, did nevertheless support the conclusion that the Contel proposal represents the best value for this procurement.[foot #] 21 Based upon the record of this case, and especially the quantifiable and nonquantifiable discriminators and the value we perceive to be obviously inherent in them, we conclude that the SSA's confirmation of the award to Contel was reasonable notwithstanding that proposal's $33.9 million higher price.

We emphasize that the CTTO analysis done by the Air Force certainly should not become the standard which agencies strive to meet. The analysis was rife with problems, and the selection of Contel could be justified despite the agency's errors -- due in large part to the discretion afforded by the RFP, the emphasis on satisfying the agency's needs, and the fact that the adequacy of the CTTO analysis has been examined here on a de novo basis.

The Parameters of De Novo Review

Grumman urges the Board to exclude any elements of the CTTO which were "not part of the administrative record at the time the award to Contel was confirmed." Protester's Posthearing Brief at 90-91.
Protester further contends that the Board, in reviewing a procurement decision, is limited to information which existed at the time the decision was made. Id. at 91-92. This gloss on de novo review is simply wrong. See Doe v. United States, 821 F.2d 694, 697 (D.C. Cir. 1987) ("De novo means here, as it ordinarily does, a fresh independent determination of 'the matter' at stake; the court's inquiry is not limited to or constricted by the administrative record, nor is any deference due the agency's conclusion."); B.P.O.A., ASBCA 25276, 82-2 BCA 15,816, at 78,373 ("When a contractor appeals a final decision to this Board our proceedings are de novo. This means that we review everything anew and are not bound by what the contracting officer did either in the way of a finding of entitlement or in the way of a quantum award.").

----------- FOOTNOTE BEGINS ---------
[foot #] 21 In sustaining this CTTO analysis, we do not accept the SSA's conclusory assertion that the nonquantifiable discriminators drove the decision and that those discriminators alone justify the decision. Rather, our conclusion is based on a recognition that the discriminators in user friendliness, system architecture, and transition which the Air Force attempted to quantify, however inartfully, did constitute significant additional value.
----------- FOOTNOTE ENDS -----------

In keeping with this standard of review, the Board in this case did consider information which was not before the SSA at the time of his decision confirming award. We also considered explanations about the technical capabilities of the systems and how, in experts' views, to value technical enhancements.

Decision

The protest is DENIED. The suspension of respondent's delegation of procurement authority lapses by its terms.

______________________________
MARY ELLEN COSTER WILLIAMS
Board Judge

We concur:

____________________________          ____________________________
EDWIN B. NEILL                        CATHERINE B. HYATT
Board Judge                           Board Judge