Distributed Data Processing

Definition

Distributed Data Processing describes the degree to which the application transfers data among physical components of the application.

Distributed data or processing functions are a characteristic of the application within the application boundary.

Score

Score   Description to Determine Degree of Influence
0       Data is not transferred or processed on another component of the system.
1       Data is prepared for transfer, then is transferred and processed on another component of the system, for user processing.
2       Data is prepared for transfer, then is transferred and processed on another component of the system, not for user processing.
3       Distributed processing and data transfer are on-line and in one direction only.
4       Distributed processing and data transfer are on-line and in both directions.
5       Distributed processing and data transfer are on-line and are dynamically performed on the most appropriate component of the system.
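The table above can be read as a short decision ladder. As a rough illustration, the following Python sketch encodes that ladder for use in estimation scripts; it is a hypothetical helper written for this document, not part of any IFPUG tooling, and the parameter names are assumptions introduced here.

    def distributed_data_processing_score(
        transfers_data: bool,       # is data transferred to or processed on another component?
        for_user_processing: bool,  # is the transferred data intended for user processing?
        online: bool,               # is the distributed processing/transfer on-line?
        both_directions: bool,      # does on-line transfer flow in both directions?
        dynamic_allocation: bool,   # is processing placed dynamically on the best component?
    ) -> int:
        """Return the Degree of Influence (0-5) per the score table above."""
        if not transfers_data:
            return 0
        if dynamic_allocation:
            return 5
        if online:
            return 4 if both_directions else 3
        return 1 if for_user_processing else 2

For example, a web-enabled application that exchanges transactions with its server in both directions would be scored distributed_data_processing_score(True, False, True, True, False) == 4.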

Hints

By definition, an application that is contained entirely on a central processor and merely sends data to other applications is not distributed data processing. In a distributed environment, the application is viewed as requiring multiple hardware components on which specific processing or data resides. A knowledgeable user would usually recognize this configuration.

  • Presentation, processing, and I/O components are all in the same place (i.e., stand-alone applications).
  • Application downloads data to a user’s client machine, so the user can use Excel or other reporting tools to prepare graphs and perform other analysis.
  • Process that transfers data from mainframe to an external component for user processing. This transfer is performed using a simple protocol such as FTP.
  • Transferred to a user for processing.
  • Process that transfers data from mainframe to mid-tier. For example, processing with SAS-PC.
  • Application sends data to client or server. This data is then processed or used to produce reports, etc. No data or confirmation is sent back to the client or server.
  • Transferred to a component for processing.
  • Data is sent between client and server in one direction only. This data is then processed or used to produce reports, etc. by the receiving application. This data typically includes transactions that update an ILF on the client or server.
  • For example, client-server or web-enabled applications.
  • Data is sent between client and server in either direction. This data is then processed or used to produce reports, etc. by the receiving application. This data typically includes transactions that update an ILF on the client or server.
  • For example, client-server or web-enabled applications (see the sketch after this list).
  • The application runs under an operating system that automatically handles the allocation between components; however, the use of the operating system did not influence the design and implementation of the application.
  • The developer must consider special application software that looks at multiple processors and runs the application on a specific type of processor. This is invisible to the user.
  • The application runs under an operating system that automatically handles the dynamic allocation between components, and the use of the operating system specifically influenced the design and implementation of the application.
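As referenced in the client-server hints above, the practical difference between a score of 3 and a score of 4 is whether the receiving side sends data back that the originator then processes. The following minimal Python sketch illustrates the two patterns with in-process stand-ins: plain functions play the client and server roles, and dictionaries stand in for each side's ILFs. It is an assumed illustration only, not a prescribed implementation.

    # One-direction (score 3) vs. both-direction (score 4) transfer, simulated in-process.
    server_ilf = {}   # data maintained on the server component
    client_ilf = {}   # data maintained on the client component

    def server_receive(transactions):
        """Process incoming transactions, update the server ILF, return confirmations."""
        confirmations = []
        for txn in transactions:
            server_ilf[txn["id"]] = txn["amount"]
            confirmations.append({"id": txn["id"], "status": "posted"})
        return confirmations

    def send_one_direction(transactions):
        """Score 3 pattern: data flows client -> server only; any reply is ignored."""
        server_receive(transactions)

    def send_both_directions(transactions):
        """Score 4 pattern: data flows both ways; confirmations update the client ILF."""
        for conf in server_receive(transactions):
            client_ilf[conf["id"]] = conf["status"]

    batch = [{"id": "T1", "amount": 100}, {"id": "T2", "amount": 250}]
    send_one_direction(batch)     # server_ilf is updated, client_ilf is not
    send_both_directions(batch)   # both ILFs are updated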

Typically

  • Many applications, including legacy applications, receive a score of 0
  • Primitive distributed applications, including batch applications in which data is not transferred on-line, receive a score of 1 to 2
  • Client-server or web-based applications receive a score of 3 to 4
  • A score of 5 is uncommon; it requires multiple servers or processors, each of which is selected dynamically on the basis of its real-time availability
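For context, Distributed Data Processing is one of the 14 General System Characteristics whose degrees of influence are summed into the Value Adjustment Factor (VAF = 0.65 + 0.01 × TDI) that adjusts the unadjusted function point count. The short sketch below assumes that standard formula; the unadjusted count and the scores for the other 13 characteristics are made-up placeholders.

    # How the Distributed Data Processing score feeds the Value Adjustment Factor.
    distributed_data_processing = 4            # e.g., on-line transfer in both directions
    other_gsc_scores = [3] * 13                # placeholder scores for the other 13 GSCs

    tdi = distributed_data_processing + sum(other_gsc_scores)   # total degrees of influence
    vaf = 0.65 + 0.01 * tdi                    # VAF ranges from 0.65 to 1.35

    unadjusted_fp = 200                        # placeholder unadjusted function point count
    adjusted_fp = unadjusted_fp * vaf
    print(f"TDI={tdi}, VAF={vaf:.2f}, adjusted FP={adjusted_fp:.1f}")
    # TDI=43, VAF=1.08, adjusted FP=216.0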