Begin With Discovery
The process of software innovation begins with a Discovery Session: an in-depth exploration of challenges, looking for possibilities and value well beyond the all-too-common incremental improvements that come from simply introducing digital technologies.
The Discovery Session is followed by strategy: crafting plans to iteratively prototype software that delivers early value, validates concepts, and informs the way forward.
The Collaboration Model provides a conceptual diagram of a typical software collaboration. Initial prototype software can be applied to real data by a client expert, exploring new workflows while preserving confidentiality and security.
As work matures, scaling to the enterprise requires an “enabling (technology) toolkit,” which Enthought has developed across multiple projects and industries.
Example Collaboration Model
Application Transforms Chip Subsystem Testing
The Freescale High Speed Signal Integrity (HSSI) group is responsible for testing the electrical characteristics of the company's chip communications subsystems, for example noise tolerance or bit-error rate. Most of the test equipment is computer-controllable via a semi-standard protocol, and users want to be able to pause, stop, and reorder test jobs to accommodate changing schedules.
Problems of data organization, data sharing, and lab automation are frequent in science. Working with Enthought, the group developed a system in which test results can be viewed as tables and plots according to flexible, user-defined queries. Users can now archive data centrally for shared viewing, and test scripts can return partial results so that long-running tests can be monitored. Test automation is now possible.
Freescale needed a toolset for scripting their tests, queueing up jobs, and storing the results in a central database. The data require flexible reporting and visualization, tied to the scripts that generated them, so that users can diagnose problems. Shown is a screen where users select a device to test.
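The queueing and partial-results behavior described above can be sketched in Python. This is a minimal illustration only; every name and data value below is invented, not part of Freescale's or Enthought's actual toolset:

```python
from collections import deque

class TestJob:
    """One scripted instrument test; the script yields partial results."""
    def __init__(self, name, script):
        self.name = name
        self.script = script   # generator yielding partial results
        self.results = []
        self.done = False

    def step(self):
        """Run one increment of the test, recording the partial result."""
        try:
            self.results.append(next(self.script))
        except StopIteration:
            self.done = True

class JobQueue:
    """FIFO of test jobs supporting submission and reordering."""
    def __init__(self):
        self.jobs = deque()

    def submit(self, job):
        self.jobs.append(job)

    def promote(self, name):
        """Move the named job to the front of the queue (reordering)."""
        for job in list(self.jobs):
            if job.name == name:
                self.jobs.remove(job)
                self.jobs.appendleft(job)

    def run_next(self):
        """Run the front job to completion and return its results."""
        job = self.jobs.popleft()
        while not job.done:
            job.step()
        return job.results

def bit_error_sweep(levels):
    """Stand-in for an instrument script: one partial result per level."""
    for level in levels:
        yield {"noise_level": level, "bit_errors": level * 2}  # fake data

queue = JobQueue()
queue.submit(TestJob("baseline", bit_error_sweep([1, 2])))
queue.submit(TestJob("urgent", bit_error_sweep([5])))
queue.promote("urgent")      # reorder: the urgent test now runs first
results = queue.run_next()
```

Because each job is a generator, partial results accumulate as the test runs, which is what makes monitoring long-duration tests possible.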
Automating Sequence Stratigraphy
Ben Lasscock, Technical Lead for Energy Solutions, discusses a deep learning tool created to automate sequence stratigraphy (the analysis of seismic images for deposition patterns of sediments). Seismic interpretation requires the repetitive application of pattern and texture recognition of seismic images, informed by the geologic understanding of a skilled interpreter.
The problems of image segmentation and training-data creation are common throughout science. Client experts guided Enthought throughout the development. The result was a cloud-based labelling toolkit that operates like a digital lab book, with every test recordable and reproducible.
Ben Lasscock | 4m | Using AI/Machine/Deep Learning to Automate Seismic Interpretation
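The "recordable and reproducible" idea can be sketched as follows: record each run's exact parameters alongside a hash of its output, so that re-running with the same parameters can be verified to reproduce the same result. This is a generic illustration, not Enthought's toolkit; the segmentation function is a stand-in:

```python
import hashlib
import json
import time

def record_run(log, params, run_fn):
    """Run one experiment and append a reproducible record of it:
    the exact parameters, a hash of the result, and a timestamp."""
    result = run_fn(**params)
    entry = {
        "params": params,
        "result_hash": hashlib.sha256(
            json.dumps(result, sort_keys=True).encode()).hexdigest(),
        "timestamp": time.time(),
    }
    log.append(entry)
    return result

def segment(threshold):
    """Stand-in for a labeling/segmentation step (invented logic)."""
    return [value > threshold for value in [0.2, 0.6, 0.9]]

log = []
first = record_run(log, {"threshold": 0.5}, segment)
again = record_run(log, {"threshold": 0.5}, segment)
# Identical parameters yield identical result hashes, so any entry
# in the log can later be checked for reproducibility.
```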
Visualization & Analysis Tools
In oil & gas developments, thin sections (microscopic images) of reservoir rock provide the closest examination of in-situ properties, essential for accurate characterization and reserves estimates.
A data processing pipeline was created that reads raw image data directly from the microscope output and feeds it into trained models to segment and identify individual grains and porosity.
Shown here is customized software created to provide an intuitive interface to visualize and navigate the multidimensional image stacks. An expert labels individual grains, which are then used to train deep learning models for automated classification.
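The grain-segmentation step described above amounts to connected-component labeling: every contiguous region of "grain" pixels gets its own integer label. A minimal pure-Python sketch of that idea (production pipelines would use library routines such as `scipy.ndimage.label` on real image arrays):

```python
def label_grains(mask):
    """Assign a distinct integer label to each connected region of True
    pixels (4-connectivity), as a stand-in for grain segmentation."""
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and labels[r][c] == 0:
                current += 1
                stack = [(r, c)]           # flood fill from this seed pixel
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and mask[y][x] and labels[y][x] == 0):
                        labels[y][x] = current
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return labels, current

# Two separate "grains" in a tiny binary image (True = grain pixel)
mask = [
    [True,  True,  False, False],
    [False, False, False, True ],
    [False, False, True,  True ],
]
labels, n_grains = label_grains(mask)
```

Each labeled region can then be paired with the expert's grain annotations to build training data for the classification models.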
Scientific Software and Python
Recurring Challenges Solved, Freeing Experts to Innovate
A number of problems common to multiple industries are being solved using Python, a language in which Enthought provides training. Among them:
- Data Wrangling and Consolidation: Consistent, accessible, shared, machine readable, and located to enable efficient computations (e.g. cloud-based).
- Visualization and Image Analysis: Efficiently accessing and integrating data to generate insights that inform and prioritize deeper analysis.
- Modeling and Simulation: Computational models to replace or guide physical experiments while generating significantly more data to enable AI/Machine Learning.
- Predictive Tools (AI/Machine Learning): Computational models to remove drudgery from the work of experts and increase the quality and consistency of results.
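The first item, data wrangling and consolidation, often means reconciling exports that use inconsistent column names and units. A minimal Python sketch of that step, using the standard-library `csv` module; the file contents, column names, and sample values are invented for illustration:

```python
import csv
import io

# Two lab exports with inconsistent column names and units (invented data)
lab_a = "sample,porosity_pct\nA1,12.5\nA2,8.0\n"
lab_b = "SampleID,Porosity (fraction)\nB1,0.101\n"

def normalize(text, sample_col, poro_col, percent=False):
    """Read one CSV export and emit rows with consistent names and units
    (porosity always as a fraction)."""
    for row in csv.DictReader(io.StringIO(text)):
        value = float(row[poro_col])
        yield {"sample": row[sample_col],
               "porosity": value / 100 if percent else value}

# One consistent, machine-readable schema across both sources
consolidated = (list(normalize(lab_a, "sample", "porosity_pct", percent=True))
                + list(normalize(lab_b, "SampleID", "Porosity (fraction)")))
```

Once the records share one schema, they can be archived centrally and queried uniformly, which is what makes the downstream visualization and modeling steps tractable.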
Meet the Enthought Experts
Over 65% of Enthoughters hold Ph.D.s, and many are leading experts in their academic fields. Their domain expertise runs deep, and all share a passion for solving scientific problems.
Enthoughters’ achievements are many, including founding SciPy in 2002, founding the HDF5 for Python (h5py) software project, co-creating OUQ theory, and creating and maintaining wxPython, a cross-platform GUI toolkit.
- Chairman & CEO
- Director, Platform Development (Ph.D., Plasma Physics)
- Director, Software Architecture
- Director, Algorithms and Machine Learning (Ph.D., Theoretical Physics)
- Manager, Build System (Ph.D., Electronic Engineering)
- Senior Scientific Software Developer (B.S., Computer Science)