THESIS CHAPTER 2 - REVIEW OF RELATED LITERATURE
The researcher found the following studies and literature relevant to the proposed system.
A. Related Literature
The rapid growth in the creation and dissemination of digital objects by authors, publishers, corporations, governments, and even librarians, archivists, and museum curators, has emphasized the speed and ease of short-term dissemination with little regard for the long-term preservation of digital information. However, digital information is fragile in ways that differ from traditional technologies, such as paper or microfilm. It is more easily corrupted or altered without recognition. Digital storage media have shorter life spans, and digital information requires access technologies that are changing at an ever-increasing pace. Some types of information, such as multimedia, are so closely linked to the software and hardware technologies that they cannot be used outside these proprietary environments [ 1998]. Because of the speed of technological advances, the time frame in which we must consider archiving becomes much shorter. The time between manufacture and preservation is shrinking.
While there are traditions of stewardship and best practices that have become institutionalized in the print environment, many of these traditions are inadequate, inappropriate, or not well known among the stakeholders in the digital environment. Originators are able to bypass the traditional publishing, dissemination, and announcement processes that are part of the traditional path from creation to archiving and preservation. Groups and individuals who did not previously consider themselves to be archivists are now being drawn into the role, either because of the infrastructure and intellectual property issues involved or because user groups are demanding it. Librarians and archivists who traditionally managed the life cycle of print information from creation to long-term preservation and archiving must now look to information managers from the computer-science tradition to support the development of a system of stewardship in the new digital environment. There is a need to identify new best practices that satisfy the requirements and are practical for the various stakeholder groups involved.
In an effort to advance the state of the art and practice of digital archiving, the International Council for Scientific and Technical Information (ICSTI), a community of scientific and technical information organizations that includes national libraries, research institutes, publishers, and bibliographic database producers, sponsored a study in March 1999 [Hodge 1999]. This study is the most recent in a series of efforts on the part of ICSTI to highlight the importance of digital archiving. The topic was first raised at the joint UNESCO/International Council of Scientific Unions (ICSU) Conference on Electronic Publishing in 1996. It was highlighted at the technical session of the June 1997 Annual ICSTI Meeting, and a working group was formed. The Electronic Publications Archive Working Group presented a white paper on the major issues in December 1998 [ 1998]. At its December 1998 meeting, the ICSTI Board approved the study on which this report is based. Based on common interest in this topic, CENDI, an interagency working group of scientific and technical information managers in the U.S. federal government, cosponsored the study.
The project managers from the cutting-edge projects emphasized the importance of considering best practices for archiving at all stages of the information management life cycle. Acknowledging this important philosophy, the best practices identified by the study are presented in the framework of the information life cycle: creation, acquisition, cataloging/identification, storage, preservation, and access.
Creation is the act of producing the information product. The producer may be a human author or originator, or a piece of equipment such as a sensing device, satellite, or laboratory instrument. Creation is viewed here in the broadest sense, as science is based increasingly on a variety of data types, products, and originators. All project managers acknowledged that creation is where long-term archiving and preservation must start. Even in rigorously controlled situations, the digital information may be lost without the initial awareness on the part of the originator of the importance of archiving. Practices used when a digital object is created ultimately impact the ease with which the object can be digitally archived and preserved. There are several key practices involving the creator that are evolving within the archiving projects. First, the creator may be involved in assessing the long-term value of the information. In lieu of other assessment factors, the creator's estimate of the long-term value of the information may be a good indication of the value that will be placed on it by people within the same discipline or area of research in the future. The U.S. Department of Agriculture's Digital Publications Preservation Steering Committee has suggested that the creator provide a preservation indicator in the document. This would not take the place of formal retention schedules, but it would provide an indication of the long-term value that the creator, as a practicing researcher, attaches to the document's contents. Second, the preservation and archiving process is made more efficient when attention is paid to issues of consistency, format, standardization, and metadata description at the very beginning of the information life cycle. The Oak Ridge National Laboratory (USA) recently announced guidelines for the creation of digital documents. Limits are placed both on the software that can be used and on the format and layout of the documents in order to make short- and long-term information management easier.
Many project managers acknowledged that the best practice would be to create the metadata at the object-creation stage, or to create the metadata in stages, with the metadata provided at creation augmented by additional elements during the cataloging/identification stage. However, only in the case of data objects is the metadata routinely collected at the point of creation. Many of the datasets are created by measurement or monitoring instruments, and the metadata is supplied along with the data stream. This metadata may include location, instrument type, and other quality indicators concerning the context of the measurement. In some cases, this instrument-generated metadata is supplemented by information provided by the original researcher.
For smaller datasets and other objects such as documents and images, much of the metadata continues to be created by hand and after the fact. Metadata creation is not sufficiently incorporated into the tools for the creation of these objects to rely solely on the creation process. As standards groups and vendors move to incorporate XML (eXtensible Markup Language) and RDF (Resource Description Framework) architectures in their word-processing and database products, the creation of metadata as part of the origination of the object will become easier.
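To illustrate the practice of capturing descriptive metadata at the point of creation, the following sketch builds a small Dublin Core record serialized as XML. The element selection and the `record_metadata` helper are illustrative assumptions, not a profile prescribed by any of the projects discussed above.

```python
# A minimal sketch of embedding descriptive metadata at object creation,
# using Dublin Core element names serialized as XML. The helper and the
# chosen elements are hypothetical, for illustration only.
import xml.etree.ElementTree as ET

DC_NS = "http://purl.org/dc/elements/1.1/"  # Dublin Core element set namespace

def record_metadata(title, creator, date, fmt):
    """Build a small Dublin Core record for a newly created digital object."""
    ET.register_namespace("dc", DC_NS)
    root = ET.Element("metadata")
    for tag, value in [("title", title), ("creator", creator),
                       ("date", date), ("format", fmt)]:
        elem = ET.SubElement(root, f"{{{DC_NS}}}{tag}")
        elem.text = value
    return ET.tostring(root, encoding="unicode")

# Metadata produced alongside the object, rather than reconstructed later.
xml_record = record_metadata("Survey Dataset 12", "J. Researcher",
                             "1999-03-01", "text/plain")
```

A record like this, generated by the authoring tool itself, could later be augmented with additional elements during the cataloging/identification stage, as the staged approach described above suggests.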
Acquisition and collection development is the stage in which the created object is "incorporated" physically or virtually into the archive. The object must be known to the archive administration. There are two main aspects to the acquisition of digital objects: collection policies and gathering procedures. In most countries, the major difference in collection policies between formal print and electronic publications is the question of whether digital materials are included under current deposit legislation. Guidelines help to establish boundaries in such an unregulated situation. There is also simply too much material that could be archived from the Internet, so guidelines are needed to tailor the general collection practices of the organization. The collection policies answer questions related to selecting what to archive, determining extent, archiving links, and refreshing site contents.
Storage is often treated as a passive stage in the life cycle, but storage media and formats have changed, with legacy information perhaps lost forever. Block sizes, tape sizes, tape drive mechanisms, and operating systems have changed over time. Most organizations that responded to the question about the periodicity of media migration anticipate a three- to five-year cycle.
The most common solution to this problem of changing storage media is migration to new storage systems. This is expensive, and there is always concern about loss of data or problems with quality when a transfer is made. Checksum algorithms are extremely important when this approach is used.
Preservation is the aspect of archival management that preserves the content as well as the look and feel of the digital object. While the study showed that there is no common agreement on the definition of long-term preservation, the time frame can be thought of as long enough to be concerned about changes in technology and changes in the user community. Depending on the particular technologies and subject disciplines involved, the project managers interviewed estimated the cycle for hardware/software migration at two to ten years.
New releases of databases, spreadsheets, and word processors can be expected at least every two to three years, with patches and minor updates released more often. While software vendors generally provide migration strategies or upward compatibility for some generations of their products, this may not be true beyond one or two generations. Migration is not guaranteed to work for all data types, and it becomes particularly unreliable if the information product has used sophisticated software features. There is generally no backward compatibility, and where backward compatibility is possible, there is certainly loss of integrity in the result.
Plans are less rigorous for migrating to new hardware and applications software than for storage media. In order to guard against major hardware/software migration issues, the organizations try to procure mainstream commercial technologies. For example, both the American Chemical Society and the U.S. Environmental Protection Agency purchased Oracle not only for its data management capabilities but also for the company's longevity and ability to influence standards development. Unfortunately, this level of standardization and ease of migration is not as readily available among technologies used in specialized fields, where niche systems are required because of the interfaces to instrumentation and the volume of data to be stored and manipulated.
Emulation, which encapsulates the behavior of the hardware/software with the object, is being considered as an alternative to migration. For example, an MS Word 2000 document would carry metadata indicating how to reconstruct the document and the MS Word 2000 software environment at the bit and byte level. An alternative to encapsulating the software with every instance of the data type is to create an emulation registry that uniquely identifies the hardware and software environments and provides information on how to recreate the environment in order to preserve the use of the digital object. [ 1998; 1999]
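The registry alternative can be sketched as a simple data structure: each archived object carries only an environment identifier, which resolves against a shared registry describing how to recreate that environment. All names and fields here are hypothetical, chosen to mirror the MS Word 2000 example above.

```python
# A sketch of the emulation-registry idea: environments are described once,
# and objects reference them by identifier instead of packaging the software
# with every object. Field names and identifiers are illustrative.
from dataclasses import dataclass

@dataclass
class Environment:
    env_id: str
    hardware: str
    operating_system: str
    application: str

# Shared registry, maintained once for the whole archive.
REGISTRY = {
    "env-word2000-win98": Environment(
        env_id="env-word2000-win98",
        hardware="x86 PC",
        operating_system="Windows 98",
        application="MS Word 2000",
    ),
}

def environment_for(archived_object):
    """Resolve an object's environment reference against the registry."""
    return REGISTRY[archived_object["env_id"]]

# An archived document carries only the lightweight reference.
doc = {"name": "report.doc", "env_id": "env-word2000-win98"}
env = environment_for(doc)
```

The design choice is the same one the paragraph describes: the per-object cost drops to a single identifier, at the price of depending on the registry being maintained and, as noted below, on manufacturers actually depositing the emulation information.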
At this time, there is no system in place to provide the extensive documentation and emulation information required for this approach to be operable, particularly to allow an archive to deal with the variety of older technologies. Most importantly, there is no policy that requires the manufacturers to deposit the emulation information. The best practice for the foreseeable future will be migration to new hardware and software platforms; emulation will begin to be used if and when the hardware and software industries begin to endorse it.
For purely electronic documents, PDF is the most prevalent format. It provides a replica of the PostScript form of the document, but relies upon proprietary encoding technologies. PDF is used both for formal publications and grey literature. The National Library of Sweden transforms dissertations received in formats other than PDF to PDF and HTML. While PDF is increasingly accepted, concerns remain for long-term preservation, and it may not be accepted as a legal depository format because of its proprietary nature.
Preserving the "look and feel" is difficult in the text environment, but it is even more difficult in the multimedia environment, where there is a tightly coupled interplay between software, hardware, and content. The U.S. Department of Defense DITT Project is developing models and software for the management of multimedia objects. Similarly, the University of California at San Diego has developed a model for object-based archiving that allows various levels and types of metadata with distributed storage of various data types. The UCSD work is funded by the U.S. National Archives and Records Administration and the U.S. Patent and Trademark Office.
A key preservation issue is the format in which the archival version should be stored. Transformation is the process of converting the native format to a standard format. On the whole, the projects reviewed favored storage in native formats. However, there are several examples of data transformation. The American Astronomical Society (AAS) and the American Chemical Society (ACS) transform incoming files into SGML-tagged ASCII format. The AAS believes that "The electronic master copy, if done well, is able to serve as the robust electronic archival copy. Such a well-tagged copy can be updated periodically, at very little cost, to take advantage of advances in both technology and standards. The content remains unchanged, but the public electronic version can be updated to remain compatible with the advances in browsers and other access technology." [ 1997]
The data community also provides some examples of data transformation. For example, the NASA Distributed Active Archive Centers (DAACs) transform incoming satellite and ground-monitoring information into the standard Common Data Format. The U.K.'s National Digital Archive of Datasets (NDAD) transforms the native format into one of its own devising, since NDAD could not find an existing standard that dealt with all their metadata needs. These transformed formats are considered to be the archival versions, but the bit-wise copies are retained, so that someone can replicate what the center has done.
In some countries, there are intellectual property questions related to native versus transformed formats. According to , an author's rights are infringed if the original work is "distorted, mutilated or otherwise modified." After much discussion, the NLC decided that converting an electronic publication to a standard format to preserve the quality of the original and to ensure long-term access does not infringe on the author's right of integrity. However, this assumption has not been tested in court.
The previous life-cycle functions that have been discussed are performed for the purpose of ensuring continuous access to the material in the archive. Successful practices must consider changes to access mechanisms, as well as rights management and security requirements over the long term.
One of the most difficult access issues for digital archiving involves rights management. What rights does the archive have? What rights do various user groups have? What rights have the owners retained? How will the access mechanism interact with the archive's metadata to ensure that these rights are managed properly? Rights management includes providing or restricting access as appropriate, and changing the access rights as the material's copyright and security level changes.
Security and version control also impact digital archiving. raises many interesting questions concerning privacy and "stolen information," particularly since the Internet Archive policy is to archive all sites that are linked to one another in one long chain. Similarly, there is concern among image archivists that images can be tampered with without the tampering being detected. Particularly in cases where conservation issues are at stake, it is important to have metadata to manage encryption, watermarks, digital signatures, etc., that can survive despite changes in the format and media on which the digital item is stored.
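The image archivists' concern above — tampering that goes undetected — can be addressed by storing a keyed digest alongside each object and recomputing it on access. The sketch below uses an HMAC over the object's bytes; the key handling is deliberately simplified and the names are illustrative, not a prescription from the study.

```python
# A minimal sketch of tamper detection for archived objects: sign the
# object's bytes with a keyed digest, store the digest as metadata, and
# verify on access. Key management is simplified for illustration.
import hashlib
import hmac

SECRET_KEY = b"archive-signing-key"  # hypothetical; a real archive would manage keys securely

def sign(data: bytes) -> str:
    """Compute a keyed SHA-256 digest over the object's bytes."""
    return hmac.new(SECRET_KEY, data, hashlib.sha256).hexdigest()

def verify(data: bytes, stored_digest: str) -> bool:
    """Return True only if the object is byte-identical to what was signed."""
    return hmac.compare_digest(sign(data), stored_digest)

original = b"image bytes"
digest = sign(original)                          # stored as archival metadata
untouched_ok = verify(original, digest)          # unmodified object passes
tampered_ok = verify(b"image bytes (altered)", digest)  # alteration fails
```

Because the digest is computed over the raw bytes, it must be regenerated whenever the archive legitimately migrates the object to a new format — which is exactly why the paragraph above calls for metadata that manages such signatures across format and media changes.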
(1996) pointed out in a lecture that systematic acquisition work begins with knowing what should be done and by whom. It is evident that professional archivists are responsible for duties such as managing the acquisitions budget, determining policies and work procedures, and resolving difficult bibliographic problems. She states further that the most routine tasks are usually left to clerks and assistants. This means that, at whatever level, it is advantageous for personnel to know and understand as much as possible about all acquisition tasks.
(1994) affirms that technical services must be transformed through the innovativeness of their leaders if archives are to become active agents of the electronic revolution, with the responsibility of collecting information in all forms. Her paper presented past and present archival practices and visions of an electronic library environment. Working environments in the past presented a variety of challenges. However, she emphasizes that knowledge of the basic concepts and processes of technical services remains a must, whether in an electronic library or a library in an information technology environment. Despite the changes, the basic process of performing archival activities remains constant; what has changed is how the work is done.
(1994) discussed the future of archiving and the leadership needed to cope with developments in the field. He recommends the following: (1) to approach the age of the digital archival system, catalogers must have a grasp of the rules on cataloging files and other non-traditional media; and (2) they must learn to use microcomputers and explore their endless possibilities to improve procedures.
(1993) stated in her study that the goals of nonprint materials must be consciously and systematically determined through an assessment of user needs. The database must consist of the widest range of available learning resources, selected according to a profile of user needs. Its retrieval system must reflect the objectives, interests, learning and teaching styles, and abilities of the school community, and its users must be educated to make active use of media materials by taking responsibility for acquiring, storing, retrieving, and disseminating information systematically.
The study used the descriptive method of research, with two sets of questionnaires as the instruments for gathering data. The survey yielded three essential findings: that there is a need to establish a media center responsible for the organization and management of audiovisual resources in order to fulfill its vital role as a partner in the school's instructional program; that the utilization of newer media resources is not maximized because the faculty are unaware of their existence; and that there is an imperative need to organize the audiovisuals in the school for more effective and efficient use. A union list of AV materials was urgently needed, and procedures and policies had to be formulated for the overall management of media or audiovisual resources in the library.
(2001) pointed out that the Philippines employed a unique strategy in carrying out the work of audiovisual (AV) archiving. In the absence of an operational AV archive in the country, most of the activities of institutions with AV archive holdings, such as the Philippine Information Agency, the UP Film Center, the Cultural Center of the Philippines, Mowelfund, etc., are informally coordinated by a professional body composed of "archivists" working in the different institutions involved in archiving.
It should be noted, though, that during the early 1980s the Philippines had a national film archive that was fully operational in the real sense of the word. After the 1986 Edsa Revolution, the archive received the least priority and was transferred to the Censors Body, which has carried out no archiving activity since then. The various tasks involved in AV archiving rested on the shoulders of related agencies, not by their own choice but by the necessity of their work, and AV archiving activities were undertaken separately. The turning point came when the Cultural Center of the Philippines, which wanted to organize a Lino Brocka Retrospective, discovered that many of the Brocka films, particularly the significant ones, were already destroyed. Some of the surviving prints were brought to the Philippine Information Agency for restoration. With no previous experience in cases of this nature, PIA had to turn to other agencies for materials that could serve as references for the work. The UP Film Center, with its contacts with international organizations and as a recipient of several fellowship awards from UNESCO, provided PIA a copy of the International Federation of Film Archives (FIAF) Manual. This was the first inter-institutional contact in the country that proved significant, as it eventually led to a formal partnership among the archivists from these institutions in what was later to become the Society of Film Archivists (SOFIA).
The Society of Film Archivists (SOFIA) was formally organized in July 1993, out of need and in compliance with the recommendation of the ASEAN Planning Workshop Meeting on Film Retrieval, Restoration and Archiving, which the Philippines organized in February 1996. SOFIA's membership is made up of individual AV archivists working in cultural organizations, information agencies, broadcasting and academic institutions, and film production companies, as well as film critics.
The role of SOFIA in the development of the AV archiving profession in the country is one that has been evolving over time. At the beginning, SOFIA served as a venue for exchanging knowledge and skills. As such, the kinds of projects undertaken during this stage were mostly training programs. Later, SOFIA became an informal coordinating body for activities undertaken by the different institutions involved in archiving. These projects were implemented by institutions as part of their in-house programs and according to what these institutions can do best. This enabled the different institutions to avoid duplication of efforts and therefore saved on scarce national resources. Presently, SOFIA’s role has evolved into that of a mobilizer and an advocate for the setting up of a central body that would formally be recognized by the government and supported by the public sector.
The effectiveness of this model in the Philippines can be better appreciated in the context of the situation prevailing at the time SOFIA was born and the commitment of the members to pursue the work. The conditions considered important in bringing about the results were the following: a) a clear vision of what was needed, which involved knowing where we were, where we wanted to go, and how to get there; b) the emergence of a steering group at the initial stage, a role assumed by the Philippine Information Agency, which, through its work requirements and involvement in ASEAN, initiated projects on the national and regional levels that brought critical agencies together, identified local partners such as the Cultural Center of the Philippines and the UP Film Center to assist in implementing the program, and identified succeeding steps to build on previous accomplishments; c) empowerment of the identified partners, which set in when the agencies involved in AV archiving decided to bond together through SOFIA and initiated and sustained programs and projects for AV archiving, as a group or individually in coordination with one another; and d) utilization of advocacy techniques that were found effective: apart from traditional tools for promotion such as the media, SOFIA requested the President of the Republic of the Philippines to issue an Executive Order declaring one week each year as AV Archiving Week, among other measures, and the organization of international conferences also helped generate interest in the issues.
This is an overview of criminal procedure in the Philippines. The sources of procedural criminal law were the constitution, the Revised Penal Code of 1930, the New Rules of Court of 1964, special laws, and certain presidential orders and letters of instruction. These governed the pleading, practice, and procedure of all courts, as well as admission to the practice of law. All had the force and effect of law.
The rights of the accused under Philippine law are guaranteed under Article 3 of the 1987 constitution and include the right to be presumed innocent until proven guilty, the right to enjoy due process under the law, and the right to a speedy, public trial. Those accused must be informed of the charges against them and must be given access to competent, independent counsel and the opportunity to post bail, except in instances where there is strong evidence that the crime could result in the maximum punishment of life imprisonment. Habeas corpus protection is extended to all except in cases of invasion or rebellion. During a trial, the accused are entitled to be present at every proceeding, to compel the attendance of witnesses and cross-examine them, and to testify or be exempt as a witness. Finally, all are guaranteed freedom from double jeopardy and, if convicted, the right to appeal.
Criminal action can be initiated either by a complaint--a sworn statement by the offended party, a witness, or a police officer--or by "information." Information consists of a written accusation filed with the court by a prosecutor, known as a fiscal at the provincial levels of government and below. No information can be filed unless investigation by a judge, fiscal, or state prosecutor establishes a prima facie case. A warrant of arrest is issued by a judge. Warrantless arrest by a police officer can be made legally only under extraordinary circumstances. Former President Aquino immediately discontinued Marcos-era practices of presidentially ordered searches and arrests without judicial process and prolonged "preventative detention actions."
Trial procedure consists of arraignment, trial, and the court's judgment and sentencing. The accused must be arraigned in the court where the complaint or information is filed. A defendant must be present to plead to the charge, except in certain minor cases where a lawyer can appear for him or her. All offenses are bailable, save the most serious cases when strong evidence of guilt exists. If a defendant has no lawyer, the court is required to supply one. Prosecution is carried out by the state prosecutor or provincial fiscal, who exercises broad discretion in screening cases and affixing charges. No jury is employed; the judge determines all questions of law and fact and passes sentence. A written sentence must be read to the court. Afterward, either party may appeal. ()
This is the overview of law enforcement in the Philippines. Until the mid-1970s, when a major restructuring of the nation's police system was undertaken, the Philippine Constabulary alone was responsible for law enforcement on a national level. Independent city and municipal police forces took charge of maintaining peace and order on a local level, calling on the constabulary for aid when the need arose. The National Police Commission, established in 1966 to improve the professionalism and training of local police, had loose supervisory authority over the police. It was widely accepted, however, that this system had several serious defects. Most noteworthy were jurisdictional limitations, lack of uniformity and coordination, disputes between police forces, and partisan political involvement in police employment, appointments, assignments, and promotions. Local political bosses routinely used police as private armies, protecting their personal interests and intimidating political opponents.
In order to correct such deficiencies, the 1973 constitution provided for the integration of public safety forces. Several presidential decrees were subsequently issued, integrating the police, fire, and jail services in the nation's more than 1,500 cities and municipalities. On August 8, 1975, Presidential Decree 765 officially established the joint command structure of the Philippine Constabulary and Integrated National Police. The constabulary, which had a well-developed nationwide command and staff structure, was given the task of organizing the integration. The chief of the Philippine Constabulary served jointly as the director general of the Integrated National Police. As constabulary commander, he reported through the military chain of command, and as head of the Integrated National Police, he reported directly to the minister (later secretary) of national defense. The National Police Commission was transferred to the Ministry (later Department) of National Defense, retaining its oversight responsibilities but turning over authority for training and other matters to the Philippine Constabulary and Integrated National Police.
The Integrated National Police was assigned responsibility for public safety, protection of lives and property, enforcement of laws, and maintenance of peace and order throughout the nation. To carry out these responsibilities, it was given powers "to prevent crimes, effect the arrest of criminal offenders and provide for their detention and rehabilitation, prevent and control fires, investigate the commission of all crimes and offenses, bring the offenders to justice, and take all necessary steps to ensure public safety." In practice, the Philippine Constabulary retained responsibility for dealing with serious crimes or cases involving jurisdictions far separated from one another, and the Integrated National Police took charge of less serious crimes and local traffic, crime prevention, and public safety.
The Integrated National Police's organization paralleled that of the constabulary. The thirteen Philippine Constabulary regional command headquarters were the nuclei for the Integrated National Police's regional commands. Likewise, the constabulary's seventy-three provincial commanders, in their capacity as provincial police superintendents, had operational control of Integrated National Police forces in their respective provinces. Provinces were further subdivided into 147 police districts, stations, and substations. The constabulary was responsible for patrolling remote rural areas. In Metro Manila's four cities and thirteen municipalities, the Integrated National Police's Metropolitan Police Force shared the headquarters of the constabulary's Capital Command. The commanding general of the Capital Command was also the director of the Integrated National Police's Metropolitan Police Force and directed the operations of the capital's four police and fire districts.
As of 1985, the Integrated National Police numbered some 60,000 people, a marked increase over the 1980 figure of 51,000. Approximately 10 percent of these staff were fire and prison officials, and the remainder were police. The Philippine National Police Academy provided training for Integrated National Police officer cadets. Established under the Integrated National Police's Training Command in 1978, the academy offered a bachelor of science degree in public safety following a two-year course of study. Admission to the school was highly competitive.
The Integrated National Police was the subject of some criticism and the repeated object of reform. Police were accused of involvement in illegal activities, violent acts, and abuse, and charges of corruption were frequent. To correct the Integrated National Police's image problem, the government sponsored programs to identify and punish police offenders, along with training designed to raise their standard of appearance, conduct, and performance.
Dramatic changes were planned for the police in 1991. The newly formed Philippine National Police was to be a strictly civilian organization, removed from the armed forces and placed under a new civilian department known as the Department of the Interior and Local Government.
Local police forces were supported at the national level by the National Bureau of Investigation. As an agency of the Department of Justice, the National Bureau of Investigation was authorized to "investigate, on its own initiative and in the public interest, crimes and other offenses against the laws of the Philippines; to help whenever officially requested, investigate or detect crimes or other offenses; (and) to act as a national clearing house of criminal records and other information." In addition, the bureau maintained a scientific crime laboratory and provided technical assistance on request to the police and constabulary.
Local officials also played a role in law enforcement. By presidential decree, the justice system in the barangays empowered village leaders to handle petty and less serious crimes. The intent of the program was to reinforce the authority of local officials and to reduce the workload on already overtaxed Philippine law enforcement agencies.
Two studies were concerned with the needs of data creators and the responsibility for archiving such data. In the traditional area of publishing it is quite clear where the responsibility for maintaining an archive of published information lies: publishers do not regard it as residing with them, and if libraries wish to preserve the books or journals they have bought, then it is their responsibility to do so. In electronic publishing the issues are not nearly as clear. In many cases, for example, libraries do not hold the database, which resides with the publisher. (1997) recommended that a national body be established in the UK to coordinate such archiving and that it be funded from the public sector, with an extension of legal deposit legislation to cover electronic publications. As far as unpublished data are concerned, universities and the funding agencies which support scholarly research are major sponsors of digital resource creation and, therefore, have a responsibility for ensuring that the research they help to create is preserved on a long-term basis. (1998) sought to establish how many such digital resources were being created, as well as the level of provision being made for their preservation. The report also considered the future needs of these bodies with regard to digital preservation.
A further study (1998) produced a strategic policy framework, which examined how different organizations approach the key stages in the life-cycle of digital resources, from creation through access to preservation. Finally, the question of post-hoc rescue, or `digital archaeology', was addressed (1998). Some data appear to be inaccessible owing to the obsolescence of the hardware or software required to read them. The study examined approaches to accessing digital materials where the media have become damaged, through disaster or age, or where the hardware or software is either no longer available or unknown. It illustrated some methods of recovery, showing that most data can be rescued given enough time and money, but emphasizing that the value of the data must be weighed against the cost of recovery.
A clear message that emerged from the studies was that a great deal of money can be wasted if digitization projects are undertaken without due regard to the long-term preservation of the digital files. It is relatively easy to produce a digital version of a book, manuscript, or museum object. Unfortunately, it is also easy to do so in such a way that either the long-term preservation of the file becomes expensive because of how it was created, or the work has to be repeated because no plan was in place for archiving the file.
But digital preservation is about much more than digitizing to facilitate the preservation of items originally produced in a different medium. The preservation of materials created in the digital domain presents an even greater challenge, since there is no opportunity to return to a non-digital original. As the world moves increasingly towards dealing with this `born-digital' information, the potentially devastating impact on the future of scholarship grows, and so must the sense of duty to solve these problems.
(1989) determined the impact of computer-assisted cataloging on the archival staff and their performance in the 21 Illinois community agencies identified as having computer-assisted cataloging. The study focused on the volume of services, the quality of catalog cards and cataloging functions, attitudes toward the job, personnel changes, and environmental changes since the advent of computer-assisted cataloging. The major findings of the study fall into two categories: duties and perceptions. The duties included increased interarchival loan transactions, fewer workshops, a decrease in turn-around time for cataloging and card production, a decrease in original cataloging, and a decrease in the volume of services since the advent of computer-assisted cataloging.
The perceptions included the catalogers’ feeling that computer-assisted cataloging was an improvement over previous cataloging procedures, and attitudes expressed toward the job were more positive than negative. The only environmental change was the addition of furniture.
Where paper-based technologies once ruled, libraries today actively promote electronic information tools, whether simply via online catalogues or through more sophisticated CD-ROMs and international networks. With these electronic information tools, archivists can choose whether to subscribe to an expensive journal or to provide access to it via the equivalent full-text database online, possibly charging the requestor for the service.
(1996) determined the importance of computers in providing access to information resources both within and beyond library walls. He stated that the new technology is hard to resist, but that it is essential that the needs of ordinary clients continue to be met by providing access to the technology and resources in a central, public location, and by making available the necessary instruction to use these media, just as readers' assistance has been provided for traditional print-based media in the past. Furthermore, he emphasized that the services provided should be examined closely and that policy decisions be taken to ensure appropriate dissemination of information resources and to address the ability to pay. Facilitating access to information, according to him, is of the greatest importance.
(1998) conducted a study on the development, organization, use, and maintenance of nonbook resources in the Mapua Institute of Technology Library. The researcher utilized the descriptive method. Two sets of questionnaires were used: one for the 13 regular full-time librarians and another for the 53 regular full-time faculty members. To supplement the information drawn from the questionnaires, personal interviews with the respondents and other librarians were conducted, and a content or documentary analysis was made.
The study found that the Mapua Institute of Technology library had no written procedural manual for nonbook materials with regard to acquisition, organization, use, and maintenance. One hundred percent of the respondents recommended the development of such a manual.
(1991) determined how well the UPLB library card catalog was performing, what its deficiencies were, and how its effectiveness could be increased. It was found that the subject catalog was more useful in locating materials than the author/title catalogs.
(1978) conducted an experimental study on the effectiveness of the National Union Catalog and the Marcfiche as searching tools in the cataloging section of the Rizal Library of the Ateneo de Manila University. The results indicated no significant difference in the time spent searching the sample in either catalog, but there was a significant difference in the number of successful searches in favor of the Marcfiche.
The experimental study showed that successful searching in the NUC (National Union Catalog) was inhibited by: 1) title entries; 2) corporate authors; 3) foreign names in author entries; 4) see-reference entries, which are overlooked in searching because of their typography; and 5) the exhausting manipulation of the series of book catalogs.
On the other hand, successful searching in the Marcfiche was inhibited by factors such as: 1) its title index being in sequential arrangement, and 2) its being a new searching tool, so that searchers had to become more familiar with its many features to use it at the optimum level. However, the number of successful searches in the Marcfiche was favorably affected by independent variables such as: 1) its multiple access points through the multiple indexes; 2) ease of access through the title index, since the title of a work is easier to determine than its main entry; and 3) its microfiche form, which makes it easier to handle and manipulate than a searching tool in book form spanning a series of volumes like the NUC.
(1993) conducted a study on the state of the selection process, or weeding, in selected archives in Metro Manila. The study sought to determine the nature and extent of weeding, the existence of deselection policies, the deselection methods and criteria used, the methods of disposing of the discarded stock, and the problems encountered by the archivists in implementing the weeding process.
Based on the findings and conclusions of the study, the author recommended the following: 1) implement a continuous and regular weeding process at least once every two years; 2) emphasize the weeding of less-used materials by transferring or relocating them to a stock room rather than discarding them entirely, inasmuch as all are related to the curriculum; 3) for archives with no weeding policies yet, formulate sound guidelines and perform the deselection process in the light of these policies; and 4) to solve the deselection problems encountered, promote awareness of the importance of weeding among archivists through lectures or seminars.