Test Data Resources
  1. Simple first examples of some test scripts.
  2. Issues with ellipse specification.

Every program or script should be tested with some form of data. In practice I used to maintain local "T" (Test) directories in each source code subdirectory. This was fine when computing platforms were stable and the number of codes and their purposes were small. However, many of the codes are designed to process digital images that are sometimes large, and storing large test data sets alongside each set of source code became difficult. In particular, to address the fragility of computing environments, I have begun transferring source code subdirectories to various machines. If a machine is infected with malware, experiences hardware problems (that may be unfixable), or if the building the machine sits in is subject to bad things (fire, water damage, yadda), the code and supporting test data and documentation are spinning safely elsewhere. My Test_Data_for_Codes project is designed for this.

There are two primary parts of this project:
  1. Collect data files, particularly digital images, that serve as testing input.
  2. Collect run scripts and documentation that allow a user to quickly run and understand a procedure.
Hence, the top level of Test_Data_for_Codes (normally stored in my .../projects directory) has the simple form:
 
% pwd
/home/sco/sco/projects/Test_Data_for_Codes

% ls
README.Test_Data_for_Codes  T_images/  T_runs/
The T_images directory holds a collection of digital images. These images will hopefully remain fairly small in size and number; the idea is to tailor my tests to a few basic images. I'll try to break different sets of images into different subdirectories and maintain a terse description of each set below.

Likewise, the T_runs directory will contain scripts and README documents that enable the user to make quick demo runs and understand how a code is executed. The actual scripts and README docs will be stored in local ./S subdirectories. The actual run directories can be anywhere (e.g. just above ./S) and are to be considered volatile space; the output of some codes may be quite large, and carrying all of the results around would become problematic.

Finally, I note that on most machines I establish an alias (usually in the .bashrc or .cshrc file) that allows me to quickly jump to the top level of Test_Data_for_Codes:
 
%  got 

%  echo $tdata 
/home/sco/sco/projects/Test_Data_for_Codes

The transcript above also shows another important feature that I take care of at the login level: I set a variable named tdata and use it in many run scripts to pull data files over from the appropriate storage repositories. As long as the system knows the location of Test_Data_for_Codes, it can access the files it needs to perform a demo.
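
As a rough sketch (the actual login lines are not reproduced in this document, so treat the exact form below as an assumption rather than the real setup), the .bashrc entries behind the transcript above might look like:

  # Illustrative .bashrc sketch (assumed form, not copied from the actual file):
  export tdata=/home/sco/sco/projects/Test_Data_for_Codes   # top level of the test-data tree
  alias got='cd $tdata'                                      # jump to Test_Data_for_Codes

  # Rough .cshrc equivalents:
  #   setenv tdata /home/sco/sco/projects/Test_Data_for_Codes
  #   alias got 'cd $tdata'

A run script can then copy its input from the repository into the local (volatile) run directory, for example:

  # "example.img" is a placeholder name, not a file guaranteed to exist in T_images:
  cp $tdata/T_images/example.img .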



