GCP IONS Meeting, Petaluma, 2002

Working Groups

The first meeting of the people most deeply engaged in analysis, programming, and development of the GCP was held in Petaluma, California, from June 14 to 17, 2002. Following the Fournier dictum, we did not expect to accomplish the work at the meeting itself, but to meet each other and then go off in small groups for the hands-on work. Here are some of the main items we talked about, with the names of people who said they would like to do this work or who may be likely contributors. Take a look, and if you see something you would like to work on, get in touch with the person listed first, who will be the organizer or group leader. If you want to work on something here and don't have an address, send a note to Roger Nelson.

  • 1. Analysis, Reanalysis

    Dean Radin, Roger Nelson, Richard Broughton, Peter Bancel

    First, identify a set of analyses that make sense and cover all the independent perspectives. Consider the implications of suite analysis for hypothesis testing. Apply it to all previously identified events in the formal table (and perhaps the major explorations). Partial list: mean deviation, variance, autocorrelation, inter-egg correlation, time-series tools, primary and secondary variables, optimal binning. Development of an automated suite that can run on the server, based on Eggshell. A data-mining engine. Consider resampling. Remember the pseudorandom clone analysis. Compare theoretical and empirical variance estimates.
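    As a purely illustrative sketch of the first two statistics on that list, one might model each trial as the sum of 200 random bits, so that under the null hypothesis a trial is Binomial(200, 0.5) with mean 100 and variance 50. The function names, the Stouffer-Z "netvar" measure, and the simulated data below are assumptions for illustration, not the project's actual code:

```python
import math
import random

# Illustrative null model: each trial is the sum of 200 random bits,
# i.e. Binomial(200, 0.5), with mean 100 and variance 50.
TRIAL_MEAN = 100.0
TRIAL_SD = math.sqrt(50.0)

def trial_z(trial):
    """Standard score of one 200-bit trial under the null model."""
    return (trial - TRIAL_MEAN) / TRIAL_SD

def netvar(trials_at_second):
    """Squared Stouffer Z across eggs for one second: a variance-style
    statistic sensitive to a deviation shared by many eggs at once."""
    zs = [trial_z(t) for t in trials_at_second]
    stouffer = sum(zs) / math.sqrt(len(zs))
    return stouffer ** 2

# Simulated null data: 60 seconds x 10 eggs.
random.seed(0)
data = [[sum(random.getrandbits(1) for _ in range(200)) for _ in range(10)]
        for _ in range(60)]

# Summing netvar over seconds gives a chi-square-like quantity,
# with expectation 60 for 60 seconds under the null.
chisq = sum(netvar(row) for row in data)
print(round(chisq, 2))
```

    The same frame accommodates the mean-deviation measure (sum the z-scores rather than squaring the Stouffer Z), which is one reason an automated suite covering all the listed statistics is attractive.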

  • 2. Meta Papers

    Peter Bancel, Roger Nelson

    Technical implementation: the papers nobody wants to write, on the deep background and details of hardware, software, environmental conditions, calibration, security, database management, and documentation. Many practical tests and calibrations should be done in an organized way. The aim is to have ready answers to all questions about the details of the project, and basic supporting material for communication in mainstream scientific channels.

  • 3. Website Split

    Roger Nelson, Peter Bancel, Rick Berger

    The website now combines the science and the aesthetics, with no facility for extracting or presenting the hard-edged material alone. The plan is to split off a "Science only" branch, where we can direct serious, conservative readers for the basics. The formal analyses and contextual explorations will be segregated and unambiguously identified. The Meta Papers will be a model part of the content, and will provide primary, authoritative source material for anyone wishing to study the solid scientific evidence for any extraordinary claims we make. The aesthetic side of the site may also present the science bits, but in a setting that is freer to treat interpretations, implications, inferences, and meaning. Here the data can be considered a source for modulations of ideas, as raw material for visual displays, music, and poetry of all sorts, as fair game for creative exploratory analysis and synthesis.

  • 4. Applications

    Joe Giove, Tom Sawyer, Peter Bancel, Roger Nelson

    We may have more work to do in the science domain, but for some purposes the GCP work can already serve as a foundation or complement. What are the experiments now telling us? What should be done next to make applications viable and effective? What needs to be done better? What non-psi experiments might be useful? How about a Global Biofeedback Monitor (GBM), as Joe suggests, possibly as a separate network? Will our effects be more reliable or larger if there is public expectation? What are the scientific concerns for moving forward with a GBM? What are the potential applications? Commercial considerations: ownership, safeguards, liabilities, etc. Cost estimates for replicating the infrastructure; budgeting for maintenance and support, etc.

  • 5. Data Specification and Extraction

    Dick Bierman, Greg Nelson

    Specification of the dataset: beginning and end of segments, binning, parameters, analysis algorithms. This could be a thesis topic for a computer science graduate student: creating a processing language to handle the complex of variables, the network, and multimedia optimization. Link to the Eggshell group.

  • 6. Correlation

    Doug Mast, York Dobyns, James Spottiswoode, Peter Bancel?, Dick Bierman?

    Comprehensive inter-egg correlation over large amounts of data. The idea is that if there is some source or influence that operates on many eggs at a given time, even though it cannot be identified, there should be a composite excess of correlation. This needs thinking about blocking or binning, efficient algorithms, independence of components, and more.
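    A minimal sketch of the composite-correlation idea, under the same hypothetical null model of trials as sums of 200 random bits, and with simulated rather than real egg data: average the Pearson correlation over all egg pairs, and a signal common to many eggs should push the average above its null expectation of roughly zero. All names and parameters here are illustrative assumptions:

```python
import math
import random

def pearson(x, y):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def mean_pairwise_correlation(eggs):
    """Average correlation over all egg pairs.  A common influence acting
    on many eggs at once should raise this above its null value near 0."""
    rs = [pearson(eggs[i], eggs[j])
          for i in range(len(eggs)) for j in range(i + 1, len(eggs))]
    return sum(rs) / len(rs)

random.seed(1)
# Null data: 5 independent eggs, 500 one-second trials each.
eggs = [[sum(random.getrandbits(1) for _ in range(200)) for _ in range(500)]
        for _ in range(5)]
r_null = mean_pairwise_correlation(eggs)

# A hypothetical shared signal added to every egg inflates the statistic.
shared = [random.gauss(0, 5) for _ in range(500)]
eggs_shared = [[t + s for t, s in zip(egg, shared)] for egg in eggs]
r_shared = mean_pairwise_correlation(eggs_shared)
print(round(r_null, 3), round(r_shared, 3))
```

    The open questions in the item remain: the naive all-pairs computation is O(n²) in the number of eggs, and the choice of block or bin length strongly affects what a "shared moment" means.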

  • 7. Eggshell: Multi-platform Compiling, Interface

    John Graham, Dean Radin, John Walker

    Adaptation of the software suite to multiple platforms. Examination and testing of Eggshell to see what is still needed. Development of a user-friendly interface for a set of default applications. Publish a CD with source and pre-compiled builds, and the auxiliary files such as rotten_eggs.csv and location.csv. Develop procedures and tools for updating the auxiliary .csv files and algorithms.

  • 8. Hardware Issues

    John Graham, Dick Shoup, Rick Berger, Ed Lantz

    Testing vulnerability to low, out-of-spec supply voltage and to other environmental impacts such as temperature and EM fields. Deeper calibration. Development of protective and preventive strategies. Clean startup and shutdown under computer or power failure. Assessment of error capture and filtering of bad data.

  • 9. Growing, Changing Network

    Dick Bierman, Paul Bethke, Dick Shoup, Charles Ostman, Lee Klinger?, Michael Breland

    Considerations for the optimal size and composition of the EGG network. New egg types, including the Potato Chip, and new technologies like single-photon capture and biochips. Implications, costs, and benefits of a larger network, distribution, and duplication. Multiple networks, Intel onboard REG sources, a one-room EGG. Muse on any and all major changes. Build and test a Potato Chip (biochip REG). If we can do something, should we?

  • 10. Publishing and Outreach

    Roger Nelson, Peter Bancel, Dean Radin, Ralph Abraham? Marilyn Schlitz? Cheryl Haley?

    We want to invite others to look at the anomalies. Journals, such as those published by the IEEE, and Statistical Science. A road show: getting invitations, via our networks of mainstream friends, to give colloquia and seminars. Grounded in gruntwork.

  • 11. Data Camp

    Peter Bancel, Marilyn Schlitz

    Undergraduate and graduate students. Concentrated sessions of a couple of weeks, with us as faculty. Our data and tools as material for intensive education in sophisticated statistical and analytical work. "If you don't like statistics, don't come."

  • 12. Funding

    Marilyn Schlitz, Roger Nelson, others

    Small grants as seed support for works in progress: preparation of technical reports, meta-papers, and mainstream submissions. Support for larger grant preparation, targeting e.g. Bial, NSF, and the International Collaboration RFAs mentioned by Marilyn. Immediate needs: brief descriptions, about one page each. Think first in terms of 5-10K chunks for projects, pending completion of the polished work we talked about. There was also some discussion of aiming higher, looking for ongoing funding adequate to support some part- or full-time positions, e.g., Peter and possibly an assistant or two. If we do aim for money to support needed developments, it is important that we work to maintain the extraordinarily valuable from-the-heart involvement that has brought us this far.

  • 13. Theory

    Charles Ostman, Frank Sudia, Ed Lantz, others

    We need some thinking about a broader context, even though it may be premature to try to explain the correlations we see in the data.

  • 14. Art

    Roger Nelson, Tom Sawyer, others

    Pleasing displays with colors, 3-D plotting, localization, musical tones, modulations of graphics, ... based on the flow of data.
