New Performance Measures for Dallas-Fort Worth, Texas System-of-Systems (SoS)

The purpose of this paper is to propose new performance measures for the local Dallas-Fort Worth, Texas System-of-Systems (SoS). Traditional performance measures do not integrate the needs of the local Dallas-Fort Worth, Texas SoS with the needs of for-profit organizations, government agencies and educational institutions. As sustainability concerns and issues become pervasive, the relationship between productivity/efficiency and effectiveness in the local Dallas-Fort Worth, Texas SoS becomes very important. Based on the results of an on-line survey and using the Balanced Scorecard template, three new performance measures are proposed: 1) Cost/Logistics Index, 2) Performance/Logistics Index and 3) Availability/Logistics Index. A comparison of these new performance measures with current performance measures is discussed.


Introduction
A performance measure is a numeric description used in for-profit organizations, government agencies and educational institutions to determine the efficiency and effectiveness of such organizations. Effective performance measures quantitatively describe important and key information about services and how they are produced. Good performance measures are typically based on data and can help determine whether an organization is achieving its objectives and what improvement is being made toward meeting organizational goals. Performance measures usually let organizations know the following: 1) how they are currently performing with regard to their respective competitors, 2) whether they are meeting their current goals, 3) whether processes are controlled, 4) whether improvements are necessary, and 5) whether there is information to make intelligent decisions [1].
A performance measure is a quantifiable expression composed of some number or amount and a unit of measure. The number or amount is a magnitude (how much) and the unit of measure is a meaning (what). In general, performance measures can be categorized into one of the following six general areas:
2. Quality - The degree to which a product or service meets customer requirements and expectations (Crosby, et al., 1992 [3]).
3. Timeliness - Whether a unit of work was done properly and on time.
4. Productivity - The value added by the process divided by the value of the labor and capital consumed (Tangen, 2002 [4]).
5. Safety - The overall health of the organization and the working environment of its employees (Maudgalya, et al., 2008 [5]).
6. Effectiveness - Compares the profit from new products to the investment in new product development (McGrath and Romeri, 1994 [6]).
Current performance measures have been discussed by many researchers. As sustainability concerns and issues become pervasive, the relationship between productivity/efficiency and effectiveness in the local Metroplex SoS becomes extremely important. Kajikawa (2010) [7] studied the interdisciplinary characteristics of sustainability science and discussed the main issues in integrating disciplines. He analyzed the structure of sustainability science and considered the necessity of transdisciplinary expertise and how such initiatives were conducted in Japan. Valerdi [8] contrasted traditional Systems Engineering with SoSE, noting, for example, that traditional Systems Engineering cost involves a single stakeholder group with a stable funding profile, whereas SoSE cost involves multiple stakeholder groups with an unstable funding profile. Baruch and Ramalho (2006) [9] explain how academic scholarly works measure organizational outcomes, which are reported as either organizational effectiveness or organizational performance (OEP). They analyzed 149 scholarly articles published in the past decade which centered on business organizations, nonprofit organizations and a combination of both. Searcy (2009) [10] explained how a System of Systems (SoS) Engineering perspective can be applied to corporate sustainability performance measurement. He indicated that measuring corporate sustainability is a difficult problem characterized by uncertainty and ambiguity. Doolen, Traxler and McBride (2006) [11] suggested a supplier scorecard to help improve supplier performance and proposed a five-step process for developing one. The authors of [12] proposed a technique for evaluating trade-off issues between performance measures, i.e., which performance measures are better than others; they suggested the use of multi-attribute utility theory when developing a balanced scorecard. Brooks and Coleman (2003) [13] discussed a process that could be used to evaluate an organization's performance measurement system.
Specifically, they proposed a process consisting of three steps, "plan-do-act". Keating, et al. (2008) [14] studied how to derive and decompose system-level requirements from a SoSE viewpoint. They described the main differences that exist between Systems Engineering (SE) and SoSE requirement domains, how SE and SoSE requirements are derived and decomposed, and the current problems with SoSE requirements. Keating, et al. (2003) [15] proposed concepts and implications for SoSE. They believe that SoSE needs disciplined approaches, must involve a different level of thinking than is currently used for traditional SE, and that best practices must be captured during the SoSE life cycle.
DiMario, et al. (2008) [16] proposed a framework that addresses SoS complexities and its challenges. They use the Zachman Framework to address SoS architecture issues. Elrod, et al. (2013) [17] discussed what performance measures Supply Chain Managers can use for their particular business and services. Supply Chain Managers play key roles in many for-profit companies and they discussed those performance measures that improve the overall supply chain process.
Laihonen, et al. (2014) [18] studied a conceptual framework for capturing the performance of a service system by applying it to two distinct service systems. Mehrabad, et al. (2012) [19] discussed the development of predictive tools in performance measurement and how to model them to help managers target performance measures based on achieving minimum cost and strategic priorities. Halachmi (2005) [20] proposed that performance management can take many forms when dealing with internal issues within an organization and that performance measures are a possible sub-system of performance management. Psomas and Kafetzopoulos (2014) [21] compared ISO 9001 certified and non-certified manufacturing companies with regard to performance measures, both financial and non-financial. Rossiter and Pantano (2006) [22] evaluated the Balanced Scorecard by listing claims made by its authors and justified further research in "dynamic performance measurement systems for global organizations." Gosselin (2005) [23] found that manufacturing firms continue to use financial performance measures, despite recommendations from experts and academics.
This paper reviews the traditional performance measures used by for-profit organizations, government agencies and educational institutions. The relation of firms, government agencies and educational institutions is framed from a System of Systems (SoS) perspective, with a discussion of the forces driving parasitic behavior and symbiotic behavior.

Traditional Performance Measures
For the purposes of this paper, the focus of the local SoS is the "Metroplex" metropolitan area of Dallas-Fort Worth, Texas. The Dallas-Fort Worth Metroplex is home to a high number of technology companies (Raytheon, Lockheed Martin, Cisco, Oracle, Texas Instruments, etc.), research universities including the University of Texas-Dallas (UTD), University of Texas-Arlington (UTA), Southern Methodist University (SMU), University of North Texas-Denton (UNT) and Texas Christian University (TCU), and community and technical colleges like Collin County Community College (CCCC) and Tarrant County Community College (TCCC). For the Dallas-Fort Worth Metroplex local SoS, interactions are very important. These interactions can be either parasitic, where the success of one requires the sacrifice of another, or symbiotic, where success for one fuels success in the others. As sustainability concerns become more prevalent, the relationship of efficiency and effectiveness in this local SoS is crucial to ensuring it is symbiotic. Sustainability is a continuing global issue, and industry, educational institutions and government agencies must continue to address it.
The current performance measures in industry, government and educational institutions include Data Envelopment Analysis (DEA) and Evidential Reasoning (ER). Chen, et al. (2010, p. 2) [24] state "DEA is a mathematical programming method for evaluating firms' productive efficiency that has been used considerably in the operations research and management literature." Borhan and Jemain (2012) [25] summarize that ER focuses on the "evidential reasoning algorithm and is different from many conventional Multi Criteria Decision Making modeling methods in that it uses an evidence-based reasoning process to reach a conclusion". They believed that ER has many advantages, as it can handle: 1) a mixture of quantitative and qualitative information, 2) a mixture of deterministic and random information, 3) incomplete information and 4) a large number of attributes and alternatives.
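DEA itself is a linear program. As a hypothetical illustration only (the input/output data are made up, not survey results, and SciPy's `linprog` is used as the solver), the CCR input-oriented multiplier model for one decision-making unit (DMU) can be sketched as:

```python
# Hypothetical DEA sketch: CCR (input-oriented, multiplier form) efficiency
# of one decision-making unit, solved as a linear program with SciPy.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0], [4.0], [3.0]])   # inputs,  one row per DMU
Y = np.array([[1.0], [3.0], [2.0]])   # outputs, one row per DMU

def dea_efficiency(o: int) -> float:
    """CCR efficiency of DMU o: max u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0."""
    n, m = X.shape                     # number of DMUs, number of inputs
    _, s = Y.shape                     # number of outputs
    # decision variables: output weights u (s of them), then input weights v (m)
    c = np.concatenate([-Y[o], np.zeros(m)])                   # maximize u.y_o
    A_ub = np.hstack([Y, -X])                                  # u.y_j - v.x_j <= 0
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

for j in range(len(X)):
    print(f"DMU {j}: efficiency = {dea_efficiency(j):.3f}")
```

With a single input and output as above, the efficiency of each DMU reduces to its output/input ratio relative to the best ratio in the set, which is a useful sanity check on the LP.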
The research design and methods for identifying the relation of these systems from a SoS perspective, and for discussing how parasitic and symbiotic behaviors affect it, follow the U.S. DoD Systems Engineering Guide for Systems-of-Systems Engineering, Version 1 (2008). The U.S. DoD guide defines four types of SoS as follows: 1. Directed SoS are those in which the integrated SoS is built and managed to fulfill specific purposes; it is centrally managed during long-term operation to continue to fulfill those purposes as well as any new ones the system owners might wish to address. The component systems maintain an ability to operate independently, but their normal operational mode is subordinated to the centrally managed purpose.
2. Acknowledged SoS have recognized objectives, a designated manager, and resources for the SoS; however, the constituent systems retain their independent ownership, objectives, funding, and development and sustainment approaches. Changes in the systems are based on collaboration between the SoS and the system.
3. Collaborative SoS are those in which the component systems interact more or less voluntarily to fulfill agreed-upon central purposes.
4. Virtual SoS lack both a central management authority and a centrally agreed-upon purpose; large-scale behavior emerges from the constituent systems and may be desirable, but the SoS must rely on relatively invisible mechanisms to maintain it.
For identifying and proposing performance measures that integrate the needs of these various SoS with regard to sustainability, Neely, et al. (2000) [26] recommend a process-based approach that has the desirable characteristics of a performance measurement system design process. In particular, they believed these performance measures must: 1) be understandable, 2) have a very clear calculation method, 3) have an understandable data collection method, 4) enable/facilitate benchmarking, 5) prefer ratio-based performance measures to absolute numbers, 6) be easy to understand, 7) provide fast feedback and 8) stimulate continuous improvement. Neely, et al. (1997) [27] recommended a structured approach which specifies what constitutes a performance measure: performance measures should 1) be derived from strategy, 2) be relevant, 3) relate to specific goals or targets, 4) be consistent, in that they maintain their significance as time passes, 5) be reported in a simple, consistent format, 6) provide relevant information, 7) be precise and 8) be objective.

On-line Survey
An on-line survey was sent to Dallas-Fort Worth SoS city managers, county commissioners, universities, community colleges, and not-for-profit and for-profit companies. Of the 100 recipients, 90 completed it. The survey, developed by the author for this paper, was used to understand what performance measures are currently collected and what performance measurement models or tools are currently implemented. Question 1 asked respondents to classify their current organization (i.e., Educational Institution (Community College or University), For-Profit or Non-Profit Company, or Government Organization (City, County, State or Federal)). Question 2 asked the number of employees in their organization. Question 3 asked "What kind(s) of performance measurement is (are) implemented in your organization?" and offered interesting responses. Question 4 asked "Which performance measurement models or tools are used in your organization?" Question 5 asked "What are the initial reasons for your organization to implement its performance measurement system?" Question 6 asked respondents to "List the top five most important performance measures in your organization and define them." Question 7 asked "For any of the top five most important performance measures listed in Question 6, are they publicly accessible?" The results for Question 3 are given in Figure 1. It is not surprising that Financial Performance was the top performance measure, since all these systems realize that satisfactory financial performance ensures they stay in business. The results for Questions 4 and 5 are given in Figures 2 and 3.
For Question 6, all on-line survey respondents listed the following among their top five: a. Cost/Financial Measure

Discussions
An important finding was that the literature review indicated a lack of research on productivity for a SoS. Also, the literature review indicated that there has not been a systematic study of performance measures in a SoS that encompasses for-profit organizations, governmental agencies and educational institutions. This gap identifies an important area for further research. The Cost/Logistics Index measures the actual cost of the SoS against the logistics information. It is calculated as follows: (Actual cost/Budgeted cost)/(Logistics Performance Index). For each system in the Dallas-Fort Worth SoS, we take the monthly actual costs expended and divide them by the budgeted costs. We then divide this value by the Logistics Performance Index. There were 90 respondents in the on-line survey. Assigning Cost Performance Index (cumulative-to-date, CPICTD) data from the DoD Selected Acquisition Reports (SARs) and Logistics Performance Index (LPI) scores from the World Bank to match each system in the Dallas-Fort Worth SoS as closely as possible, the Cost/Logistics Index measures for each of the 90 respondents are shown in Figures 4 and 5. Figure 4 shows the Cost/Logistics Index for fifty-four systems in the Dallas-Fort Worth, TX SoS and Figure 5 shows the Cost/Logistics Index for the remaining thirty-six systems.
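The calculation above can be expressed as a minimal sketch (the cost and LPI figures below are hypothetical, not taken from the SAR or World Bank data):

```python
# Sketch of the proposed Cost/Logistics Index (CLI):
#   CLI = (actual cost / budgeted cost) / LPI
# All figures below are hypothetical, for illustration only.
def cost_logistics_index(actual_cost: float, budgeted_cost: float, lpi: float) -> float:
    """Lower is better: staying under budget and a high LPI both drive CLI down."""
    return (actual_cost / budgeted_cost) / lpi

# A system 5% under its monthly budget in a region with an LPI of 3.8:
cli = cost_logistics_index(actual_cost=950_000, budgeted_cost=1_000_000, lpi=3.8)
print(f"CLI = {cli:.3f}")  # CLI = 0.250
```

Note that the cost ratio (0.95 here) is dimensionless, so the index inherits the scale of the LPI divisor; a ratio near 1.0 with an LPI near 3.5-4.0 yields scores in the 0.25-0.30 range reported for the Dallas-Fort Worth SoS.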
This data shows that the lower the Cost Index (i.e., less than 1.0) and the higher the LPI (i.e., scores of 3.5 or higher), the lower the Cost/Logistics Index measure, which is desirable. Lower Cost/Logistics Index measures are good scores while higher scores are not. The scores range from 0.245 to 0.644. Summing all ninety Cost/Logistics Index measures and dividing by ninety gives an average Cost/Logistics Index measure of 0.3478 for the Dallas-Fort Worth, TX SoS. This means that this measure is a good indicator for measuring the Dallas-Fort Worth, TX SoS from a Cost/Logistics viewpoint. A lower value implies the SoS is performing within its cost budget while also managing the logistics that come with it.
The Performance/Logistics Index measures the performance of the Dallas-Fort Worth, TX SoS against the logistics information. For each system in the Dallas-Fort Worth, TX SoS, performance is defined as (system operational time/system non-operational time)/(Logistics Performance Index). For each system, we take the system operational time and divide it by the system non-operational time. This value is then divided by the Logistics Performance Index. There were 90 respondents in the on-line survey. Assigning values from 0 to 1.0 for each system's performance in the Dallas-Fort Worth, TX SoS and LPI scores from the World Bank to match each system as closely as possible, the Performance/Logistics Index measures for each of the 90 respondents are shown in Figures 6 and 7. Figure 6 shows the Performance/Logistics Index for fifty-four systems in the Dallas-Fort Worth, TX SoS and Figure 7 shows the Performance/Logistics Index for the remaining thirty-six systems.
This data shows that the higher the Performance Index (i.e., values approaching 1.0) and the higher the LPI, the lower the Performance/Logistics Index measure. This measure is a good indicator for measuring the Dallas-Fort Worth, TX SoS from a Performance/Logistics viewpoint. A lower value implies the SoS is performing as required while also managing the logistics that come with it. The Availability/Logistics Index measures the availability of the Dallas-Fort Worth, TX SoS against the logistics information. For each system in the Dallas-Fort Worth, TX SoS, availability is defined as (system uptime/system downtime)/(Logistics Performance Index). For each system, we take the system uptime and divide it by the system downtime. This value is then divided by the Logistics Performance Index. Assigning values from 0 to 1.0 for each system's availability in the Dallas-Fort Worth, TX SoS and LPI scores from the World Bank to match each system as closely as possible, the Availability/Logistics Index measures for each of the 90 respondents are shown in Figures 8 and 9. Figure 8 shows the Availability/Logistics Index for fifty-four systems in the Dallas-Fort Worth, TX SoS and Figure 9 shows the Availability/Logistics Index for the remaining thirty-six systems.
This data shows that the higher the Availability Index (i.e., values approaching 1.0) and the higher the LPI (i.e., scores of 3.5 or higher), the lower the Availability/Logistics Index measure, which is desirable. Lower Availability/Logistics Index measures are good scores while higher scores are not. The scores range from 0.212 to 0.418. Summing all ninety Availability/Logistics Index measures and dividing by ninety gives an average Availability/Logistics Index measure of 0.278 for the Dallas-Fort Worth, TX SoS. This measure is a good indicator for measuring the Dallas-Fort Worth, TX SoS from an Availability/Logistics viewpoint. A lower value implies the SoS is performing as required while also managing the logistics that come with it.
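The two remaining indices follow the same pattern as the Cost/Logistics Index. A minimal sketch (all times and LPI values below are hypothetical, and the paper's convention of normalized ratios between 0 and 1.0 is assumed):

```python
# Sketch of the Performance/Logistics Index (PLI) and Availability/Logistics
# Index (ALI) as defined above, plus the SoS-wide average in the style of the
# paper's reported figures. All inputs are hypothetical illustrations.
def perf_logistics_index(op_time: float, non_op_time: float, lpi: float) -> float:
    # PLI = (operational time / non-operational time) / LPI; lower is better
    return (op_time / non_op_time) / lpi

def avail_logistics_index(uptime: float, downtime: float, lpi: float) -> float:
    # ALI = (uptime / downtime) / LPI; lower is better
    return (uptime / downtime) / lpi

# Hypothetical systems: (op_time, non_op_time, uptime, downtime, LPI)
systems = [
    (900.0, 1000.0, 950.0, 1000.0, 3.8),
    (700.0, 1000.0, 800.0, 1000.0, 3.5),
]
pli = [perf_logistics_index(o, n, l) for o, n, _, _, l in systems]
ali = [avail_logistics_index(u, d, l) for _, _, u, d, l in systems]
print(f"average PLI = {sum(pli) / len(pli):.3f}")
print(f"average ALI = {sum(ali) / len(ali):.3f}")
```

With the ratios normalized to 0-1.0 and LPI scores near 3.5-4.0, both indices fall in the 0.2-0.3 band, consistent with the ranges reported above.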

Conclusions
Based on the data and survey results collected and using the BSC Template, the following performance measures are proposed for the Dallas-Fort Worth, Texas SoS. These three performance measures can assess cost, performance and availability for each system in a SoS with regard to sustainability, whereas current measures do not. On a scale from 0 to 0.99, the lower the scores for each measure, the better each system in a SoS performs (cost, performance and availability) with regard to sustainability. For example, in the Dallas-Fort Worth, TX SoS, El Centro Community College has the #1 CLI = 0.24, #12 PLI = 0.22 and #14 ALI = 0.23. Since El Centro Community College is an educational institution, we can research why an educational institution in the Dallas-Fort Worth, TX SoS ranks so high on these three performance measures. For the non-profit system, Global Future Institute has the #90 CLI = 0.64, #90 PLI = 0.37 and #89 ALI = 0.41. The question to ask is, "Why is this non-profit system ranked low on these three performance measures?" For a for-profit system, L3 Mustang Technology has the #2 CLI = 0.24, #18 PLI = 0.22 and #10 ALI = 0.22. For a governmental institution, the City of Dallas, TX has the #5 CLI = 0.25, #23 PLI = 0.23 and #18 ALI = 0.23. Each performance measure indicates how a particular system in a SoS is performing in terms of cost, performance and availability with regard to sustainability.
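Rankings of this kind can be produced mechanically from the index scores. The sketch below sorts the four example systems quoted above by their CLI (lower is better); the scores are the ones stated in this section, rounded to two decimals, so ties are broken arbitrarily:

```python
# Ranking SoS member systems by Cost/Logistics Index (lower is better),
# using the four example scores quoted in the text above.
cli_scores = {
    "El Centro Community College": 0.24,
    "L3 Mustang Technology": 0.24,
    "City of Dallas, TX": 0.25,
    "Global Future Institute": 0.64,
}
ranked = sorted(cli_scores.items(), key=lambda kv: kv[1])  # stable sort keeps tie order
for rank, (name, score) in enumerate(ranked, start=1):
    print(f"#{rank} {name}: CLI = {score:.2f}")
```

The same sort applied to PLI or ALI scores yields the corresponding per-measure rankings.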
The three proposed measures can be applied to any geographic SoS, while traditional measures cannot. For example, these three performance measures could be used to perform a SoS study comparing the geographic SoS of a community that attracts an existing company with that of the community that loses it. As another example, if Texas Instruments, which has its corporate headquarters in the Dallas-Fort Worth, TX SoS, were to relocate to the Houston, TX SoS, these three performance measures could be used to explain or predict that type of move.
Funding: This research received no external funding.

Conflicts of Interest:
The author declares no conflict of interest.

Author biography: Dr. Ilseng has experience directing projects and programs at Raytheon Company, with expertise in specialty and systems engineering; technical and program management; proposal preparation and management; and new product development and design through production, test and evaluation, qualification, and compliance. He has experience in process development and implementation, ensuring and verifying product and system integrity, and encouraging creative and effective solutions.