Issue
For any company, and for its IT department, a problem arises when a
large number of applications is approached in the traditional way,
that is, by building a dedicated application for every survey or project.
Applications then end up developed on different platforms
(VB, C++, C#, Java...), while data is stored and processed in different places,
from the server to the local machine, and in various formats, from relational
databases to text files. Dependence on diverse technical platforms and vendors
becomes extremely high, and data chaos sets in. Business processes are slow and
hard to control, the burden of redundant processes on human resources
grows, and the response to user needs slows down.
Data processing and process automation, not only in statistics
but anywhere, always consist of three main phases: data entry,
data editing and, finally, data processing with report generation.
Looking at things this way, we realize that we keep writing
the same three programs over and over.
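To make the three phases concrete, here is a minimal sketch in Python; the sample input, the field names and the function names are hypothetical, invented only for this illustration, not taken from any actual survey system.

```python
import csv
from io import StringIO

# Hypothetical survey data with two fields, "region" and "value";
# one record is deliberately invalid to exercise the editing phase.
RAW_INPUT = """region,value
North,120
South,not_a_number
East,95
"""

def enter_data(raw):
    """Phase 1 - data entry: read raw records into dictionaries."""
    return list(csv.DictReader(StringIO(raw)))

def edit_data(records):
    """Phase 2 - data editing: validate records and keep only clean ones."""
    clean = []
    for rec in records:
        try:
            rec["value"] = int(rec["value"])
            clean.append(rec)
        except ValueError:
            pass  # a real system would log or correct the faulty record
    return clean

def process_data(records):
    """Phase 3 - processing and report generation: aggregate and print."""
    total = sum(rec["value"] for rec in records)
    print(f"records: {len(records)}, total value: {total}")

process_data(edit_data(enter_data(RAW_INPUT)))
```

Whatever the subject matter, only the field names, the editing rules and the report layout change; the entry-edit-process skeleton stays the same, which is exactly why it keeps being reprogrammed.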
Is it possible:
- to have an environment that does not depend on constant technological change?
- to conduct statistical surveys with minimal programming effort... or, even better, with no real programming at all?
- to store applications in databases, alongside the data itself?
- to be more resilient to IT staff turnover and to reduce the cost of IT licences?