This report describes the work done by Moon Express under subcontract to INTEK Advanced Communications (INTEK) to develop and test the ILO-X instrument prototype under its Phase 1 development plan. The overall goal of Phase 1 was to develop an ILO-X prototype instrument to a PDR-level design, validated by a hardware demonstration that integrated a Commercial-Off-The-Shelf (COTS) camera, a lens, hyperspectral capability, support electronics, a motorized telescope mount, and controller API software accessible over the internet. The work culminated in international demonstrations of the instrument with ILOA and several of its worldwide partners. ILO-X Observation & Communication Demonstration 1.0 took place on 21 July 2011 at the NASA Research Park in Mountain View, California, where participants controlled the instrument over the web to image several predetermined objects.
The ILO-X instrument was built by adapting COTS components. The instrument development was an intricate process, starting with the construction of a test jig that allowed the design constraints to be determined. Using these constraints, an aluminum housing was designed and manufactured to contain the lens / filter / CCD assembly.
The ILO-X instrument Phase 1 proof-of-concept incorporates a COTS liquid crystal tunable filter (LCTF), a VariSpec VIS-20-352 from Cambridge Research & Instrumentation, between a catadioptric lens and a COTS astrophotography camera to create a programmable solid-state multispectral imaging system operating in the visible range with a 7 nm bandwidth. Because of time constraints, hardware decisions were made early in the design process and did not change through the first phase of development.*
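The multispectral operation described above amounts to stepping the tunable filter across the visible band and capturing one frame per passband. The following is a minimal sketch of that capture sequence; the `TunableFilter` and `Camera` classes are simulated stand-ins invented for illustration, not the actual instrument drivers, and the wavelength range and step are assumed values.

```python
class TunableFilter:
    """Simulated stand-in for an LCTF driver (VariSpec-style wavelength tuning)."""
    def __init__(self):
        self.wavelength_nm = None

    def tune(self, wavelength_nm):
        # A real driver would command the filter and wait for it to settle.
        self.wavelength_nm = wavelength_nm


class Camera:
    """Simulated stand-in for the astrophotography camera driver."""
    def expose(self, seconds):
        # A real driver would return pixel data; here we return a record.
        return {"exposure_s": seconds}


def capture_stack(filt, cam, start_nm, stop_nm, step_nm, exposure_s):
    """Capture one frame per filter setting, keyed by wavelength."""
    stack = {}
    wavelength = start_nm
    while wavelength <= stop_nm:
        filt.tune(wavelength)                  # set the next passband
        stack[wavelength] = cam.expose(exposure_s)
        wavelength += step_nm
    return stack


stack = capture_stack(TunableFilter(), Camera(), 450, 700, 50, 2.0)
print(sorted(stack))  # wavelengths captured: [450, 500, 550, 600, 650, 700]
```

The stack of per-wavelength frames is what makes the system "programmable": the same hardware yields different spectral products purely by changing the tuning schedule in software.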
Software was developed to run a suite of operations, with the end goal of a user interface that made operating the instrument easy even for a novice. The resulting software controlled the instrument mount, filter, and CCD. The graphical user interface allowed users to access these controls through a web interface and to set exposure times, filter settings, and targets to image. After taking an image, the user could see it displayed in the web application.
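The web-facing control layer described above can be sketched as a function that validates a user's request (exposure time, filter setting, target) and translates it into an ordered list of instrument commands. All names, fields, and limits below are illustrative assumptions, not the actual ILO-X API.

```python
def handle_observation_request(params):
    """Validate a web request and turn it into an ordered command list.

    Hypothetical sketch: field names and the 400-720 nm tuning limits
    are assumptions for illustration.
    """
    errors = []

    exposure = params.get("exposure_s")
    if not isinstance(exposure, (int, float)) or exposure <= 0:
        errors.append("exposure_s must be a positive number")

    wavelength = params.get("filter_nm")
    if not isinstance(wavelength, (int, float)) or not 400 <= wavelength <= 720:
        errors.append("filter_nm must fall within the filter's visible range")

    target = params.get("target")
    if not target:
        errors.append("target is required")

    if errors:
        return {"ok": False, "errors": errors}

    # Commands are ordered: point the mount, set the filter, then expose.
    return {
        "ok": True,
        "commands": [
            ("slew_mount", target),       # point the motorized mount
            ("tune_filter", wavelength),  # set the LCTF passband
            ("expose_ccd", exposure),     # take the image
        ],
    }


result = handle_observation_request(
    {"exposure_s": 2.0, "filter_nm": 550, "target": "Moon"})
```

Validating at the web layer and emitting a fixed command sequence keeps novice users from issuing out-of-range or out-of-order instrument operations.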
*The hardware and design have since changed in Phases 2 and 3 to allow the use of more space-ready components and to maximize the scientific / educational value of observations.