Virtual Cell Project blog


As we grow older, we reach a time in our lives when we question our real purpose. After many hours of self-reflection, I have come to the conclusion that my life's purpose is to use my talents to help others fight cancer. Seeing how cancer affects so many people and their families drives me crazy. As a human race, we can put a man on the moon and send craft to Mars and out of the solar system; however, we don't seem to be able to cure cancer. There has to be a better way.

Now, I am not a scientist, doctor, or biologist; however, I am a technology architect with 33+ years of experience in artificial intelligence, robotics, virtual reality, programming, and leadership. So I was thinking: how can I use my strengths to help the people who can cure cancer do so? My solution: we need to build virtual cells using virtual reality and artificial intelligence, so researchers can test possible solutions and see the results faster.

Now, others have tried this approach, and some institutes are currently attempting the same idea. But from my research, most of these are mired in internal politics and in creating standards instead of taking action. We would need a team not hampered by such things: a team of action and deliverables.

As there are multiple types of cells in the human body, we would need to create multiple virtual ones. All cells follow the laws of chemistry, which in turn follow the laws of physics, and these laws could be encoded as a type of artificial intelligence (AI). Now High Fidelity (HiFi) is giving us a chance to make this a reality. I envision each component of a cell as a HiFi domain that follows the laws of chemistry. Each human cell is then another HiFi domain, in which each component receives feedback from its individual component domain, and the component domain in turn receives feedback from the cell domain. When a researcher introduces a particular molecule to a cell, they could see the results of that interaction in virtual real time.

Of course, this is a daunting undertaking and I may not live long enough to see it come to reality; however, we have to start somewhere. So I plead for anyone out there who shares this vision to join me in this effort. We would need skills from all areas. So no matter your skills, there is a place for them in this goal to cure cancer. You may not be a scientist or a doctor; however, you can help make a difference. Will you join me?

Scott Peal (aka @VR_Architect)

This thread will serve as our team’s blog.


So where to start…

We will need to simulate the interactions between atoms using atomic physics. But first we must create a user interface where an element can be selected and placed in-world with its associated properties, such as size. And before we do that, we need to learn how to program in HiFi :smile:

So the first step is to settle on a template for our scripts. I have come up with this so far and welcome feedback.

// Script: 	yourScriptNameHere.js
// Purpose: 	Script purpose here
// Project: 	Virtual Chemistry Application
// Author: 	Scott Peal aka @VR_Architect
// Copyright:	2015 Fortune Cookie Software LLC
// License: 	Distributed under the Apache License, Version 2.0.
//  		See the accompanying file LICENSE or
//  		http://www.apache.org/licenses/LICENSE-2.0.html
// Updates: 	v.01 01JUN15 - Initial version

// Variables

// Script Start Function
function scriptStarting() {
	print("Script is STARTING");
	print("Script has completed STARTING");
}

// Script Update function
function scriptUpdating(deltaTime) {}

// Script Stopping function
function scriptStop() {
	print("Script is STOPPING");
	print("Script has completed STOPPING");
}

// Script Events
function mousePressEvent(event) {}

// Register Events
Controller.mousePressEvent.connect(mousePressEvent);

// Register the update function
Script.update.connect(scriptUpdating);

// Register the ending function
Script.scriptEnding.connect(scriptStop);

// Run the start function
scriptStarting();

print("Script RAN TO END (script may still be running though!)");

Now to figure out how to create a user interface. More to come…


In HiFi, user interface (UI) elements are called Overlays. To create a UI element, you use code like this:

var myElement = Overlays.addOverlay("image", {
  x: 0,
  y: 0,
  width: 32,
  height: 32,
  imageURL: "yourPathToTheImage",
  color: {
    red: 255,
    green: 255,
    blue: 255
  },
  alpha: 1
});

Where these properties apply:

  • x: the position on the screen left-to-right, where 0 is the far left. Values: +/- decimals
  • y: the position on the screen top-to-bottom, where 0 is the top. Values: +/- decimals
  • width: how wide the UI element is, in pixels. Values: +/- decimals
  • height: how tall the UI element is, in pixels. Values: +/- decimals
  • imageURL: where the image shown as the UI element can be found
  • color: what color the UI element should be. Values: 0-255 per RGB channel
  • alpha: the transparency of the UI element. Values: 0.25 to 1.0 (note that there is a bug where a value above 1.0 seems to use only the part after the decimal, so a value of 1.5 renders as 0.5).
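Until that bug is fixed, a defensive sketch like this (the helper name is my own) keeps alpha in range before it ever reaches Overlays.addOverlay:

```javascript
// Clamp alpha into [0, 1] so an accidental 1.5 stays fully opaque
// instead of wrapping around to 0.5.
function clampAlpha(alpha) {
  if (alpha < 0) { return 0; }
  if (alpha > 1) { return 1; }
  return alpha;
}
```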

The UI coordinate origin (0,0) is the top left of the screen. The screen width and height can be retrieved via this command:

var screenSize = Controller.getViewportDimensions();

Then reference the width with

var screenWidth = screenSize.x;

And the height with

var screenHeight = screenSize.y;

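As a quick sketch of putting the viewport dimensions to use, here is a small helper (my own, not part of the HiFi API) that computes the top-left position needed to center an overlay of a given size:

```javascript
// screenSize is assumed to be the {x, y} object returned by
// Controller.getViewportDimensions().
function centerOverlay(screenSize, width, height) {
  return {
    x: (screenSize.x - width) / 2,
    y: (screenSize.y - height) / 2
  };
}

// e.g. centering a 32x32 element on a 1920x1080 viewport:
var pos = centerOverlay({ x: 1920, y: 1080 }, 32, 32);
// pos.x === 944, pos.y === 524
```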

We’ll need a message system to simulate proteins, I guess. So queues and publish/subscribe topics to get entities to communicate. There is something like that for JavaScript in Dojo.
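A minimal publish/subscribe sketch in plain JavaScript, as a stand-in for the Dojo-style topics; the topic name and payload below are purely illustrative:

```javascript
// A tiny topic-based message bus: handlers subscribe to a topic name,
// and publish delivers a message to every handler on that topic.
var MessageBus = {
  topics: {},
  subscribe: function (topic, handler) {
    if (!this.topics[topic]) { this.topics[topic] = []; }
    this.topics[topic].push(handler);
  },
  publish: function (topic, message) {
    (this.topics[topic] || []).forEach(function (handler) {
      handler(message);
    });
  }
};

// Example: a "protein" entity announcing that it bound a ligand.
var received = [];
MessageBus.subscribe("protein/bound", function (msg) {
  received.push(msg.ligand);
});
MessageBus.publish("protein/bound", { ligand: "ATP" });
// received is now ["ATP"]
```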


@Phineas.Click great idea! What is your vision?

I found a protein databank online, so if we could figure out how to take their file format and convert it to 3D, would that help?


@VR_Architect: That was more or less just a random thought about what this project might need on the infrastructure side. Maybe it would be a good idea to define the scope of the project.
Btw. you might be interested in this:
A Whole-Cell Computational Model Predicts Phenotype from Genotype


Excellent paper and insightful for our 3D model. In the paper they seem to model at the molecular level. I agree with your observations and we need a scope. How is this for a draft?

Our scope is to model a healthy adult (non-pregnant) human female luminal epithelial cell at the atomic level. This cell type is the most affected in breast cancer. Modeling at the molecular level is based upon observations and would exclude undiscovered phenomena. If we can model at the atomic level, the system will self-create molecules and their interactions. Of course, molecular observations remain very important as test cases to validate the model's AI.


The initial element picker user interface is working. It detects which element is clicked and places a sphere of the correct atomic radius and color into the world. The Los Alamos National Laboratory's periodic table provided the base data for element names, symbols, weights, and sizes. The open-source Jmol molecular visualizer's element colors were used for atom colors.
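For anyone curious, the picker logic boils down to a lookup along these lines. The radii (empirical, in picometres) and Jmol CPK colors shown cover only a few sample elements, and the scale factor is an assumption:

```javascript
// Sample element data: name, empirical atomic radius, Jmol CPK color.
var ELEMENTS = {
  H: { name: "Hydrogen", radiusPm: 25, color: { red: 255, green: 255, blue: 255 } },
  C: { name: "Carbon",   radiusPm: 70, color: { red: 144, green: 144, blue: 144 } },
  N: { name: "Nitrogen", radiusPm: 65, color: { red: 48,  green: 80,  blue: 248 } },
  O: { name: "Oxygen",   radiusPm: 60, color: { red: 255, green: 13,  blue: 13  } }
};

// Map an element symbol to the sphere properties we would hand to the
// entity creation call; returns null for an unknown symbol.
function sphereFor(symbol) {
  var e = ELEMENTS[symbol];
  if (!e) { return null; }
  return {
    name: e.name,
    dimensions: e.radiusPm * 0.001, // pm -> world units (assumed scale)
    color: e.color
  };
}
```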

Next steps:

  • Element Picker
    • Add a text box so a user can type the chemical notation for a molecule or element to create
    • Add a close button
    • Add a border
    • Allow UI to be moved on the screen
  • Add a HiFi-like button to open and close the element picker
  • Model
    • Turn on mesh collisions
    • Add model properties for the element such as chemical notation, ionization, etc.
    • Create a mesh collision event to detect whether two touching atoms would result in a chemical reaction and thus join
    • Space joined atoms apart per polarization rules


@philip - I have worked with many different game engines and simulation engines attempting to create user interfaces / HUDs. It is always extremely difficult and time-consuming, and it takes away from the core functions being worked on.

With High Fidelity, I was able to learn how to create a script and create a basic UI/HUD in about 1 hour. That sir, is incredible! This is a wonderful change and I thank you and your team very much for the fantastic experience I just had.


Very cool! Keep going with this, I’d love to see more chemistry/physics educational content.


In OpenSim I normally use UCSF Chimera to convert PDB to DAE format, then put it through MeshLab. What you get, of course, is a static mesh, which may not be at all what you are after if you are modelling protein dynamics. However, if it helps, do bear in mind that there are also mesh models available from electron cryotomograms.

Result (I was playing with ):

Apologies if you’re way ahead of this by now…

PS: Having seen Philip’s recent talk it looks like you are being influenced by Drew Berry’s work? I hasten to add that I’m very much an amateur at all this.
PPS: You can also generate small mesh molecules from PDB files directly in MeshLab (and most likely Jmol though I’ve not tried that).
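If going straight from the PDB files themselves is useful, the ATOM records are fixed-column text, so a coordinate extractor is only a few lines. The column offsets below follow the published PDB format; the function itself is just a sketch:

```javascript
// Pull element symbols and x/y/z coordinates out of PDB ATOM records.
// PDB fields are fixed-width: x at columns 31-38, y at 39-46,
// z at 47-54, element symbol at 77-78 (1-based columns).
function parsePdbAtoms(pdbText) {
  return pdbText.split("\n")
    .filter(function (line) { return line.indexOf("ATOM") === 0; })
    .map(function (line) {
      return {
        element: line.substring(76, 78).trim(),
        x: parseFloat(line.substring(30, 38)),
        y: parseFloat(line.substring(38, 46)),
        z: parseFloat(line.substring(46, 54))
      };
    });
}
```

Each returned object could then drive one sphere placement, the same way the element picker does.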


Thanks for the insights. We will look into those awesome resources. We can also use them as test cases to verify that the dynamically created model matches these known structures.

I don’t know Drew Berry. I am influenced by my work with artificial intelligence, through which I reached the conclusion that for full AI to work properly, it needs to simulate some human experiences: input from sensors like the five senses, along with positive/negative reinforcement. I believe virtual worlds could offer that simulation at the level required for AI to benefit from it. If not, we have to wait for robotics to reach the point where it can be used.



Element Picker

  • Add a text box so a user can type the chemical notation for a molecule or element to create
  • Add a close button - DONE
  • Add a border - DONE
  • Allow UI to be moved on the screen
  • Add a HiFi-like button to open and close the element picker
  • On model creation, attach element properties as userdata - DONE
  • On model creation, attach model event script


Model

  • Turn on mesh collisions
  • Add model properties for the element such as chemical notation, ionization, etc.
  • Create a mesh collision event to detect whether two touching atoms would result in a chemical reaction and thus join
  • Space joined atoms apart per polarization rules


An example of Drew’s visualization work:


WOW !!! I just watched the video. Excellent work.

Now if we can replicate the same end results by dynamically following the principles of atomic and molecular science, that might bring value to researchers.


I guess it’s worth pointing out that Drew’s visualizations are inevitably gross simplifications of what is actually taking place in the cell (Alan Kay got quite animated about it at a different TED talk). To give you an idea of what the cytoplasm looks like in the bacterium Escherichia coli, see (there’s a brief video towards the end).

From your approach I’m guessing that this project is complementary to the model Philip showed in the Emtech video, i.e. cellscience/cellschool. From my educator’s perspective, you might be interested in seeing some of the interactives created by the Concord Consortium as possible starting points:


I have a dream … destroy cancer !!!