ALCTS Technical Services Directors of Large Research Libraries Discussion Group (Big Heads)

June 14, 2002, 9:30 a.m.–12:30 p.m.
Atlanta, GA
Westin Peachtree Plaza Hotel, Augusta Room

Recorded by Judith Hopkins, University at Buffalo

For the text of the Round Robin on issues of concern to these institutions, which was distributed via the Big Heads electronic discussion list in the weeks prior to the Atlanta meeting, see the Big Heads web page.


  1. Welcome, Introductions, Announcements   (5 minutes)

  2. Election of Vice chair/Chair-elect (5 minutes)

  3. California's Mellon-funded e-journal study: When do end users prefer digital content and when do they prefer hardcopy? - Brian Schottlaender (20 minutes)

  4. Information exchange on issues of long-term preservation of digital resources and the role of the Library in digital preservation efforts - Judi Nadler (20 minutes) (Note: This topic includes the organization of digital and preservation initiatives in our libraries. Do most of us combine them in one unit or are they separate? How are we organized for digital initiatives? Are these initiatives housed in technical services? Who creates the metadata - catalogers or others?)

  5. University of Washington Licensing Metadata project/DLF NISO metadata meeting - Jim Stickman (15 minutes)

  6. Break (15 minutes)

  7. Cataloging e-resources - Issues and problems - Bob Wolven (20 minutes) (See also item 12)

  8. Publications Pattern Initiative - Sally Sinn and Jean Hirons (20 minutes) (see the CONSER website for information on the patterns initiative)

  9. OCLC's new interface migration plans - Glenn Patton will provide a brief overview/update on behalf of OCLC.(20 minutes) Questions for group: What plans are we making to implement the interface? How do we think the new interface changes will impact our operations? What additional information do we and other libraries need?

  10. Functional Requirements for Bibliographic Records (FRBR): What is it and what does it mean for our operations? - Glenn Patton (20 minutes)

  11. Training catalogers - please see the Carol Hixson/Jean Hirons white paper (20 minutes)

  12. Other


Attendees:

Bob Wolven (Columbia)
Scott Wicks (Cornell)
Deborah Jakubs (Duke)
Jeff Horrell (Harvard)
Beacher Wiggins (LC)
Sally Sinn (NAL)
Duane Arenales (NLM)
Cynthia Clark (NYPL)
Arno Kastner (NYU)
Trisha Davis (OSU)
Rosann Bazirjian (Penn State)
Katharine T. Farrell (Princeton)
Catherine Tierney (Stanford)
Armanda Barone (UC-Berkeley)
Cindy Shelton (UCLA)
Judith Nadler (U of Chicago)
Barb Henigman (U of Illinois-UC)
Leighann Ayers (U of Michigan)
Barb Stelmasik (U of Minnesota)
Larry Alford (UNC-Chapel Hill)
Carton Rogers (U of Pennsylvania)
Robin Fradenburgh (U of Texas)
Beth Picknally Camden (U of Va)
Jim Stickman (U of Washington)
Irene Zimmerman (U of Wisconsin)
Ann Okerson (Yale)

Brian Schottlaender (UC-San Diego)
Glenn Patton (OCLC)
Jean Hirons (CONSER Coordinator, LC)

  1. Welcome, introductions and announcements (Larry Alford, Chair)

    Chair Larry Alford welcomed the group and paid tribute to Judith Hopkins’ volunteer work of preparing the minutes and placing them and the round robin reports on the web.

  2. Election of Vice-Chair/Chair-Elect:

    Arno Kastner (NYU) was elected unanimously.

  3. Update on the University of California's Mellon-funded e-journal study.

    When do end users prefer digital content and when do they prefer hard copy? Brian Schottlaender said the study is collecting empirical data on faculty and student use of digital journals. The University of California system developed criteria for a multi-institutional project and then asked the Mellon Foundation to fund it. They now have six months of data (he distributed a spreadsheet showing results to date, October 2001 – March 2002). He urged that these preliminary results be accepted with caution. UC plans to publish a mid-project technical report in College and Research Libraries and a more philosophical report in Portal. The campuses of the University of California have a stable, if not deteriorating, infrastructure; libraries are crowded. There is a fairly substantial set of electronic journals on all the system's campuses: about 7,000 at the beginning of the project, a number now 50 percent larger. The system also has two storage facilities, one in northern California and the other in the southern part of the state. The southern one can hold 11,000,000 volumes, yet it is already beginning to fill up. There is also unplanned redundancy between the two facilities.

    Approximately 300 journals are being used in the study (five percent of the corpus), each with at least two print copies available in the system. The control campuses made material available as usual, while at the experimental sites physical copies were removed to remote storage and users were asked to depend only on the electronic version unless they specifically expressed a need for the print version. UC used the following criteria to choose sample journals: availability of digital use data from the publisher; a mix of journals, some with current issues available in both forms and some with current issues available only electronically; journals in various disciplines; and journals with various physical characteristics, such as heavy use of graphics, text in various languages, and articles of various lengths. Forty percent of the sample was from the physical sciences, forty percent from the life sciences, ten percent from the social sciences, and ten percent from the humanities. The research objectives are to discover the factors determining the acceptability of digital over print in the journals themselves, in the characteristics of the users, and in the users' technology environment. Another objective is to see whether the purpose for which a journal is used determines the acceptability of the digital version. They will gather six more months of quantitative data and will also soon start interviewing users to get a sense of their attitudes; this latter process is expected to last for the remainder of 2002.

    Tentative quantitative conclusions:

    1. Print use is higher when print is on site.
    2. Print use is very low whether or not the print version is shelved on site.
    3. Digital use is one or two orders of magnitude higher than print use, whether or not a print version is available.

    These findings are constant and stable across all disciplines.

    Tentative qualitative conclusions: Content isn't always available in digital form. There are three reasons for this:

    1. A curatorial decision to omit matter such as advertisements, preliminary matter, indexes, etc.
    2. Material is absent not by curatorial decision but capriciously.
    3. The publisher takes content down.

    Content is taken down in two ways:
    1. Publishers stop making a particular title available electronically, or
    2. They remove content from a particular title.

    These findings are making the University of California re-think licensing agreements.

    Sally Sinn (NAL) asked if licensing arrangements let them know of omissions or if users let them know. Brian responded that the information came from users. Bob Wolven (Columbia) asked if the designation of experimental versus control campuses was by campus or title by title; it is title by title. Some institutions made it known that certain titles were being taken off the shelves; others didn't. In response to a question from Ann Okerson (Yale), Brian said you might be safe in making only the electronic version available to users as long as there was a print backup somewhere (not necessarily locally). Judi Nadler (Chicago) asked how he thought the findings of the Outsell survey matched UC's findings. She commented that faculty like to find things online and then use them in print. Brian said their preliminary results parallel the Outsell data; people want to use library data but not to come to the library. Larry Alford commented that at the University of North Carolina at Chapel Hill visitor data is higher than ever. Brian quoted someone as saying people are coming to the library, but not to see us!

    Someone asked about the conclusions he is drawing about the continuing availability of print: how long will publishers keep printing? Rosann Bazirjian (Penn State) asked if he could categorize the type of users who asked for material from storage; his impression was that requests were largely faculty-driven, chiefly from social sciences faculty who wanted students to study publications qua publications.

  4. Information exchange on issues of long-term preservation of digital resources and the role of the Library in digital preservation efforts

    Judi Nadler (University of Chicago) provided a framework for discussion.  The challenge she presented was as follows.

    As custodians of cultural heritage, libraries have served the role of repositories of traditional research resources and have been entrusted with their accessibility and long-term care. Extending from traditional resources to also include resources in electronic form, and assuming responsibility for creating, converting, and acquiring such resources, libraries have de facto also assumed the responsibility for providing access to and ensuring maintenance of these resources over time.

    Discovery metadata (cataloging) and traditional means of preservation ensure access to and the long-term life of paper resources. The inherent fragility of digital resources demands attention, often much sooner than paper resources require it. Also, in addition to discovery metadata, preservation metadata (technical, administrative, structural, and other) is required for the maintenance and long-term usability of electronic resources. Decisions regarding preservation metadata and its creation must be made up front, as early in the process as possible.

    Standards for preservation metadata are still evolving. Understanding, documenting, and following the standards is crucial for future interoperability.
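    The distinction between discovery metadata and preservation metadata can be made concrete with a small sketch. This is illustrative only: the field names below are generic examples chosen for this sketch, not any standard schema.

    ```python
    # Illustrative sketch: how discovery metadata (what catalogers record
    # for finding) differs from preservation metadata (what keeps a digital
    # object usable over time). Field names are examples, not a standard.
    digital_object = {
        "discovery": {                      # traditional cataloging data
            "title": "Annual Report 2001",
            "creator": "Example University Library",
            "subject": ["University archives"],
        },
        "preservation": {                   # needed for long-term care
            "format": "application/pdf",            # technical
            "checksum": "md5:placeholder",          # fixity
            "created": "2001-12-31",                # administrative
            "rights": "in copyright",               # rights
            "structure": ["cover.pdf", "body.pdf"], # structural
        },
    }

    print(sorted(digital_object))
    ```

    The point Ms. Nadler raised follows from the shape of the data: the preservation half must be captured when the object is created or acquired, because fields like format and fixity cannot be reconstructed reliably later.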

    The relation between digital preservation and property rights is not clear yet, and sources of expertise must be identified and nurtured.

    The investment in digital resources is high and the layout of related organizational structures varies. The more decentralized the responsibilities, the more the need for ongoing communication and a shared decision-making process.

    Ms. Nadler suggested these issues for information exchange. How equipped are we to:

    1. Develop a library knowledge base on issues of long-term care of digital resources
    2. Foster awareness and monitor development of standards
    3. Assess parameters of scope and scale -- the level of resource commitment the library can make to the long-term care of its digital resources (converted, created, acquired)
      • Selective?
      • Comprehensive?
      • Project based?
      • Program level?
    4. Assess the breadth of commitment the library wants to make or is expected to make.
      For the Library? For the University? Are we equipped for this?
    5. Decide on the Library's role: as repository for the University? As advisor to the University?
    6. Explore models
      • Build local repository?
      • Cooperate with others? (consortial approach)
      • Identify and use trusted repositories?
      • Combine some or all of the above?
    7. Explore and weigh options for technical strategies
    8. Develop and implement models for organizational structures that best support the various aspects of our digital activities

    How are we set up to support these activities? Catherine Tierney (Stanford) asked whether there is something about these materials different enough to make us approach them differently. Judi Nadler replied that the difference is one of scale; selection is an issue as well: what content should be preserved, and what does preservation mean in this context? Jeffrey Horrell said that Harvard has been thinking about the distinction between enduring collections and ephemeral collections. Some things move between these two, and different criteria and assumptions are made for each type. See the Harvard Library Digital Initiative website for more information. Judi Nadler said the criterion they would like to promote at Chicago is: the materials they are responsible for creating are the materials they are responsible for maintaining. Duane Arenales (NLM) said there are legal issues; for materials born digital, decisions cannot be delayed as long as they can for materials born print. One has to work to determine that born-digital items are complete and in a form transferable to other formats. Creators need a way to make it known that they are taking responsibility for maintaining such items and keeping them up to date. NLM is using an extension of the Dublin Core for its own publications. Larry Alford said that the University of North Carolina at Chapel Hill (UNC-CH) is creating catalog records for each item it is keeping (about 1,200 so far in OCLC). Bob Wolven (Columbia) referred to efforts of the Digital Library Federation and OCLC to maintain registries but added that we still don't have organizational models. Jeffrey Horrell (Harvard) said a number of us have been working with the Mellon Foundation; the economic and publisher/licensee issues are among the most complicated to develop models for.

    Sally Sinn (NAL) asked how many are directly engaged in working with their institutions on practices for digitizing materials. Beacher Wiggins said LC has gotten a hundred million dollars to look into this and to bring the many players together. (The grant requires matching funds, either in cash or material.) LC is using the first $5 million of the grant to make contact with others: publishers, users, etc. LC doesn't have anything to share yet but hopes to have a more comprehensive plan by the time of the Big Heads midwinter meeting in Philadelphia in January 2003.

    Judi Nadler listed some of the types of help the University of Chicago would be willing to contribute. Cynthia Shelton (UCLA) said the California Digital Library is taking the lead in California to develop a model and to make decisions. Ann Okerson (Yale) commented on the great variety of knowledge needed to make these decisions: university computing, library systems, catalogers, preservation staff, etc., will all need to be involved. She thought we do not consider end users enough and urged that the Mellon Foundation support a study of user needs in various disciplines. Publishers also need to be involved because we need them to produce material in formats that will be used. Judi Nadler asked if there would be time for discussion of organizational aspects; Larry Alford (UNC-CH), the chair of the meeting, suggested doing so over lunch. Duane Arenales (NLM) commented that one difference between digital resources and traditional library resources is the dynamic nature of this material; digital publications differ over time, and in fact the Washington Post online shows differences within a single day. It was concluded that this is a topic we may want to continue to monitor and exchange information on.

  5. University of Washington Licensing Metadata project/DLF NISO metadata meeting - Jim Stickman

    E-Resource Management Metadata and Systems

    Jim Stickman reported on continuing efforts by Tim Jewell, Head, Collection Management Services, University of Washington, and others to inventory data elements and functions in emerging systems that help librarians manage licensing and support of electronic resources.

    At the last ALA Midwinter Meeting Tim led a discussion sponsored by Big Heads that attracted some 40 librarians and led to further discussions of functions and data elements. Following this meeting, an informal steering group was formed that included Tim, Ivy Anderson (Harvard), Adam Chandler (Cornell), Sharon Farb (UCLA), Kim Parker (Yale), and Nathan Robertson (Johns Hopkins). This group worked with Pat Harris and Priscilla Caplan (NISO) and Dan Greenstein (then at DLF) to conduct a successful Workshop on Standards for Electronic Resource Management at a Digital Library Federation (DLF) meeting on May 10, attended by 50 librarians and representatives from a number of vendors and publishers.

    One outcome of the workshop was general agreement that standards would be helpful to all parties. Subsequent discussions among the meeting organizers identified two complementary “tracks” for follow-up work to be undertaken in the near future. One track would aim at the development of a general “functional specification/best practice” document, while the other would focus on the areas where data is most likely to be exchanged over time, and therefore be most likely to benefit from formal standardization.  The steering committee is developing a proposal to DLF requesting support for a project to foster the rapid development of improved tools for managing licensed e-resources – whether by individual libraries, consortia, or vendors.

    The steering committee is providing an update and leading another discussion at a meeting later in the Atlanta conference.  The steering committee seeks the continuing sponsorship of Big Heads through the annual meeting in Toronto (2003).

    More information is available at the Web Hub for Developing Administrative Metadata for Electronic Resource Management.

    Jim noted that the University of Washington Libraries has recently begun working with Innovative Interfaces Inc. on the development of an electronic resources management module for the III automated library system.

    Judi Nadler (University of Chicago) thanked Mr. Stickman and the group for doing this work. Digital tools for managing electronic resources are something many libraries need, hers included. She asked how receptive vendors are to developing such tools. Jim Stickman said that the vendor of the system the University of Washington uses, Innovative Interfaces Inc. (III), is very interested in modifying that system to include more licensing data, to establish relationships among data, and to provide better reporting and better catalog displays. The University of Washington is putting together a draft list of elements and various scenarios for III's use. He said he hopes that other vendors will do something similar. Judi Nadler asked if this would be a tool that would become an intrinsic part of III or something that could be used by other vendors; he couldn't answer but said he thought that III was thinking of the former.

  6. Break - (15 minutes)

  7. Cataloging e-resources - Issues and problems - Bob Wolven (20 minutes) (Because of time pressures this topic was skipped at this time and the group proceeded on to the next topic; Bob Wolven gave a brief report on this topic at the end of the meeting. See item 12)

  8. Publications Pattern Initiative - Sally Sinn and Jean Hirons (20 minutes) - (see the CONSER website for information on the patterns initiative)

    Sally Sinn (NAL), Chair of the CONSER Task Force on Publication Patterns and Holdings, summarized the aims of the initiative and its successes to date. She noted that we have the MARC21 holdings format, and vendors profess to be compliant with it, but not all the pieces are yet in place. Suppose you are planning to move from system A to vendor system B, but the vendor cannot deal with bibliographic data from system A. The analogous problem motivates establishing a national store of publication pattern data that can migrate as libraries change integrated library systems.

    The CONSER Task Force on Publication Patterns and Holdings seeded the publication pattern database by taking 40,000 Harvard records and attaching them to bibliographic records in OCLC; OCLC created an 891 field (Publication Pattern Data) for the data.  What is the time and effort investment involved in this?  It is the first real implementation of the MARC21 Format for Holdings Data. 

    The TF is working to keep systems vendors involved. What do vendors offer now when they profess to be compliant? How capable are they of outputting data? How able are they to accept input data? As of June 2002 the TF completed a two-year pilot project to add patterns to the database, including four to five thousand records contributed by participants plus the Harvard records. The TF has submitted proposals to MARBI to add new coding to the Holdings Format, and two vendors are now able to make use of the patterns from the OCLC records. The TF worked on the SCCTP (Serials Cataloging Cooperative Training Program) course on MARC holdings for serials and formulated a statement on what compliance with the holdings format means (see the CONSER website). Libraries need to use existing standards and to put pressure on vendors to support standardized application of MARC holdings.
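    The idea behind publication pattern data can be shown with a toy example: caption subfields (as in an 853-style pattern field in the MARC 21 holdings format) are paired with enumeration values (as in an 863-style holdings field) to render a human-readable holdings statement. This is a deliberately simplified sketch, not a real MARC 21 holdings implementation, which handles chronology, indicators, and many more subfields.

    ```python
    # Toy illustration of MARC-holdings-style publication patterns:
    # an 853-style field supplies the captions, an 863-style field
    # supplies the enumeration values, and the two combine for display.
    captions = {"a": "v.", "b": "no."}   # 853-style caption subfields
    values   = {"a": "12", "b": "3"}     # 863-style enumeration subfields

    def render(captions, values):
        # Pair each caption with its value, in subfield order.
        parts = [captions[code] + values[code] for code in sorted(captions)]
        return ":".join(parts)

    print(render(captions, values))  # → v.12:no.3
    ```

    Storing the pattern once and the values separately is what lets the same data migrate between systems: any compliant vendor can regenerate the display from the two pieces.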

    The CONSER Publication Pattern Initiative has been wedded to print versions but needs to relate to digital as well. We may need to determine publisher intent for born-digital materials. We are in our infancy in what we can do with publication pattern data. There is still much skepticism about the value of contributing to a database of publication patterns; the perception is that it incurs added cost and added processing time. Sally Sinn said it is not an onerous addition. There is also a perception that there is no long-term value to it. If all vendors implemented a standard method of transferring holdings data, just as bibliographic data is transferable, wouldn't that be worthwhile?

    The pilot project is ending now, but the participants have agreed to continue contributing publication pattern data. If you believe this is a valuable effort, pressure your vendors to become compliant. Jean Hirons (CONSER Coordinator, LC) put in a plea for more participation; the more records are input in OCLC, the more pressure will be put on vendors. Participants can work in OCLC even if they are not CONSER members; they would be given CONSER Enhance status. Duane Arenales (NLM) asked about participation by subscription agents. Jean Hirons said someone from EBSCO is part of the group, but none are yet using the Holdings format. Duane Arenales (NLM) said perhaps the format is too complicated for them. Judi Nadler (Chicago) said CONSER needs to provide cost figures for managers and to continue to focus on benefits. Sally Sinn (NAL) asked how participation has been included in libraries' workflow. We need to decide where the data should ultimately reside; perhaps we should put publication pattern data in a separate database linked to CONSER bibliographic records. Someone commented that when you catalog you have only the first issue, and the pattern is not yet clear; there needs to be a way for non-catalogers to update the record a year or so later. Jean Hirons (LC) said use of the Bremer macro makes original input very easy; it takes only 2-3 minutes.

  9. OCLC new interface migration plans - Glenn Patton provided a brief overview/update on behalf of OCLC.  (20 minutes)

    Glenn Patton (OCLC) distributed copies of the new OCLC brochure and said it is available on the OCLC website, along with a functionality list. The key points are:

    • The initial version of the browser interface will become available on June 30, 2002.  Once it is available you can try it out; all current authorizations and passwords will work. 
    • There is no software to download. 
    • Connexion contains all the current functionality of CORC, the CatExpress interface, and WebDewey and has been expanded to deal with all types of materials. 
    • The plan is for quarterly enhancement releases starting in fall 2002. 
    • There will also be a Windows client to Connexion, starting in the second quarter of 2003. 

How does the implementation of Connexion affect current interfaces to cataloging? Passport will work until the end of 2003; CatME will work for the foreseeable future.  Z39.50 access to cataloging will continue to be available.

The Connexion website has a functionality listing (10 pages) which allows you to compare what can be done in Passport or CatME with the functionality available in the first version of the browser interface and the Windows client. Various migration paths are possible (see p. 3 of the brochure), but they generally fall into one of three groups.

  The crucial factor is to examine your workflows in conjunction with the functionality list. He also advised that the more complex your workflows, the more likely you will want to wait until the Windows client, with its macros feature, is available. Another workflow consideration is that the current Save files will not be the same as the Save files in the browser interface; users will need to clean out Save files prior to migration.

Someone asked whether we need to think of this as an institutional migration or an individual one. Glenn said that depends on workflows; staff who do different types of work might be able to move at different times. Someone asked when current CJK functionality will move. Glenn said that current CJK and Arabic software will continue to function as they do until brought forward into the Windows interface; this involves Unicode implementation. While OCLC does use some Unicode now for CJK, Arabic, and parts of CORC, mid-2003 is a possible date for full implementation of Unicode. Bob Wolven (Columbia) had two questions. First, would it be fair to say that Passport would remain functional past December 2003 if the need arose? Glenn answered that the probable answer is yes. Second, what are the plans for migration of the internal OCLC units that do cataloging, quality control, etc.? Glenn said his staff has moved to CatME; some high-volume activities will probably stay with Passport until the Windows client becomes available.

Cynthia Clark (NYPL) asked what kind of training will be provided, or whether Connexion is expected to be intuitive. Glenn said a tutorial will be offered along with the Windows interface, and regional networks will provide training. Cynthia Shelton (UCLA) asked about pricing. Glenn said pricing models will continue to be the same as they are for Passport and CatME.

  10. Functional Requirements for Bibliographic Records (FRBR): What is it and what does it mean for our operations? Glenn Patton (20 minutes)

    The Functional Requirements for Bibliographic Records, or FRBR (ISBN 3-598-11328-X), is the result of six years of work by an IFLA study group. The process started with an international conference in Stockholm in 1990 that looked at how the world of bibliographic data had changed in the last half century: the growth of shared cataloging databases, the role of publishers and distributors in providing bibliographic data, the role of electronic publications, etc. This led to the recognition that these kinds of changes were stretching traditional practices of cataloging. While cataloging rules and practices had changed over the years to accommodate new types of materials, this change had not been made in a principled way: the rules were adjusted to deal with each new situation rather than by establishing general principles. The IFLA study group was to look at what we do when we catalog, what kinds of information we record, how necessary that information is, etc., and to build a conceptual model of how bibliographic data works that could be used as the basis for a more principled look at cataloging. The study group identified four tasks that users of all types (including library staff) perform: 

    • To find entities that correspond to the user’s stated research criteria
    • To identify an entity
    • To select an entity appropriate to the user’s needs
    • To acquire or obtain access to the entity described 

    The FRBR model defines three groups of entities and describes the relationships among them. 

    1. The products of intellectual or artistic endeavor (works, expressions, manifestations, items)
    2. Those responsible for the intellectual or artistic content (person or corporate body)
    3. Those that serve as the subjects of intellectual or artistic endeavor (concept, object, event, and place; persons and works can also be the subjects of works)

    Past catalogs have tended to be flat sequences of individual items that didn't show hierarchical relationships. The potential of the FRBR model is to assist in developing those relationships so that users can deal with smaller result sets instead of hundreds or thousands of bibliographic records. 

    Mr. Patton provided graphics which illustrated the four-level FRBR model. The first was based on a single work, Shakespeare's Hamlet (a work), which was realized through various expressions (a French translation and a German translation), each of which was embodied in various manifestations (Paris, 1946; Hamburg, 1834), which were exemplified by different items (physical copies in different libraries).

    The second was for a set of related works, all based on the novel Show Boat by Edna Ferber (with a pun relating her last name to FRBR). There was the novel itself (work), with a Polish translation (expression) and a specific edition of that translation (manifestation). There were also the 1936 motion picture directed by James Whale (work), the 1951 motion picture directed by George Sidney (work), and the Kern-Hammerstein musical of 1927 (work), the last realized through various expressions (a score of the vocal selections, a recording of selections, and the original cast recording of the 1946 revival), each of which was embodied in various manifestations (different publishers and dates).
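    The work/expression/manifestation/item hierarchy in Mr. Patton's first graphic can be sketched as a simple nested data structure. This is an illustrative sketch only; FRBR specifies entities and relationships, not an implementation, and the class and attribute names here are this sketch's own.

    ```python
    from dataclasses import dataclass, field

    # Illustrative only: names below are not part of the FRBR specification.

    @dataclass
    class Item:                   # a single copy held somewhere
        holder: str

    @dataclass
    class Manifestation:          # a published embodiment
        place: str
        year: int
        items: list = field(default_factory=list)

    @dataclass
    class Expression:             # a realization, e.g. a translation
        language: str
        manifestations: list = field(default_factory=list)

    @dataclass
    class Work:                   # the abstract intellectual creation
        title: str
        expressions: list = field(default_factory=list)

    # Hamlet, following Mr. Patton's first graphic (library names invented)
    hamlet = Work("Hamlet", expressions=[
        Expression("French", [Manifestation("Paris", 1946, [Item("Library A")])]),
        Expression("German", [Manifestation("Hamburg", 1834, [Item("Library B")])]),
    ])

    # A catalog built on this model can show one work-level result
    # and let the user drill down, instead of listing every item.
    print(hamlet.title, len(hamlet.expressions))  # → Hamlet 2
    ```

    The collocation benefit discussed below falls out of the structure: many items roll up to one work, so a result list can start at the top of the hierarchy.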

    Judi Nadler (Chicago) asked what, conceptually, is the difference between FRBR and MULVER (the multiple versions project). In terms of the origin of FRBR, Glenn didn't think there was a direct relationship, though some of the same people were involved in both projects; MULVER was interested in reproductions. Work is going on in the local system vendor community, particularly in Europe (funded by the European Union), on ways to organize and display records for users; OCLC plans to incorporate FRBR into its new interface. Judi Nadler commented that MULVER was highly regarded conceptually but fell apart because of the perceived difficulty of applying it. How is FRBR different? Glenn responded that there is a lot of interest in FRBR from the vendor community and the cataloging rules community; perhaps now is the time, and MULVER was before its time. 

    Larry Alford (UNC-CH) said FRBR sounded like a powerful idea and model, but the implications for local operations seem pretty big; he asked if Glenn had any ideas on how we might handle that. Glenn said he didn't, though a general movement to introduce the library community to FRBR is beginning; he pointed to various programs at this conference. He would like to hide FRBR from catalogers as much as possible and let systems provide links, or at least provide options among which catalogers could choose. Cynthia Shelton (UCLA) said catalogers know this material intellectually; they know about collocating works which systems display by date or language. Glenn said that was true to some degree, but the experience at OCLC shows there are many things we haven't done as much in the past as we could have, e.g., use of uniform titles. It would be hard to go back and supply this information for older things, or even for some categories of things; we have been more likely to use uniform titles for literary works than for scientific works.

    Duane Arenales (NLM) said she was curious as to what effect he thought the FRBR approach would have on MARC. Glenn said there is a discussion paper entitled "Dealing with FRBR expressions in MARC 21" which MARBI will consider at its meeting on Saturday, June 15. He also mentioned Tom Delsey's Functional Analysis of the MARC 21 Bibliographic and Holdings Formats, which contains an analysis of MARC in relation to the FRBR model. Do we need work and expression records in addition to manifestation records, or can they be derived from existing manifestation records? [For a research project on this latter topic, cf. Dr. Edward O'Neill's presentation at the CCS Cataloging and Classification Research Discussion Group on Saturday, June 15, 2002.]

  11. Training catalogers

    Larry Alford urged people to look at the Carol Hixson/Jean Hirons white paper.

  12. Other

    Larry Alford asked Bob Wolven (Columbia) to briefly summarize issues relating to e-resources for discussion at a future meeting.

    Mr. Wolven mentioned several developing approaches to e-resource cataloging, including new rules for integrating resources, use of e-journal management vendors to supply cataloging and holdings data, and a CONSER proposal for a new kind of one-record approach to e-journals, which has some relationship to FRBR. He noted that these issues overlap with several items on the agenda, and concluded by asking several questions.

    1. Is the CONSER proposal a good thing or not?
    2. Are our cataloging staffing, training and organizational models well suited to these developing approaches?
    3. What implications do these cataloging models have for our ability to support preservation and archiving of e-resources?

    The meeting was adjourned at 12:32 p.m.