14 AFrame.IO Resources For Your WebXR Project

AFrame Logo

I’m a big fan of the work of the AFrame.IO community. Thank you to Mozilla, Diego Marcos, Kevin Ngo, and Don McCurdy for their influence and effort in building a fun and productive platform for WebVR experiences. In this post, I’ve collected a few GitHub repositories and resources to support you in building A-Frame experiences.

Talk Abstract: In the next few years, augmented reality and virtual reality will continue to provide innovations in gaming, education, and training. Other applications might include helping you tour your next vacation resort or explore a future architecture design. Thanks to open web standards like WebXR, web developers can leverage their existing skills in JavaScript and HTML to create delightful VR experiences. During this session, we will explore A-Frame.io, an open source project supported by Mozilla enabling you to craft VR experiences using JavaScript and a growing ecosystem of web components.

https://github.com/ngokevin/kframe
Kevin’s collection of A-Frame components and scenes.

https://webvr.donmccurdy.com/
Awesome WebXR from Don McCurdy

https://github.com/feiss/aframe-environment-component
Infinite background environments for your A-Frame VR scene in just one file.

https://github.com/aframevr/aframe-school
Interactive workshop and lessons for learning A-Frame and WebVR.

https://aframe.io/aframe-registry/
Official registry of cool AFrame stuff

https://github.com/donmccurdy/aframe-physics-system
Components for A-Frame physics integration, built on CANNON.js.

Experiment with AR and A-Frame
AFrame now has support for ARCore. Paint the real world with your XR content! Using Firefox Reality for iOS, you can leverage ARKit on your favorite iPad or iPhone.

https://github.com/michaelprosario/aframe
I’ve assembled a small collection of demo apps that explore some of the core ideas of AFrame.

AFrame Layout Component
Automatically positions child entities in 3D space, with several layouts to choose from.

Animation
An animation component for A-Frame using anime.js. Also check out the animation-timeline component for defining and orchestrating timelines of animations.

Super Hands
All-in-one natural hand controller, pointer, and gaze interaction library for A-Frame. Seems to work well with Oculus Quest.

A-Frame component for loading Google Poly models
This component enables you to quickly load 3D content from Google Poly.

aframe-htmlembed-component
HTML Component for A-Frame VR that allows for interaction with HTML in VR. Demo

https://github.com/nylki/aframe-lsystem-component
L-System/LSystem component for A-Frame to draw 3D turtle graphics. Using Lindenmayer as backend.

Thanks to the amazing work from Mozilla, WebXR usability has improved through specialized Firefox browsers.
Firefox Reality
Firefox Reality for HoloLens 2 – For raw ThreeJS scripts, it works well. I’m still testing A-Frame scenes.

If you live in Central Florida or Orlando, consider checking out our local chapter of Google Developer Group. We enjoy building a fun, creative community of developers, sharing ideas and code, and supporting each other in the craft of software. Learn more about our community here:

GDGCentralFlorida.org

Top Stories on InspiredToEducate.NET

Build a Space Shooter with Phaser3 and JavaScript (Tutorial 3)

In this blog post series, I want to unpack building a 2D shooter game using Phaser3.js. Phaser3 provides a robust and fast game framework for early-stage JavaScript developers. In this tutorial, we will add aliens to the scene, give them some basic movement, and blow them up. Sound like a plan? Here’s what we will build.

Please make sure to check out Tutorial 1 to get started with this project. You’ll need to build upon the code and ideas from the previous blog posts. (post 1, post 2)

To see the code in a completed state, feel free to visit this link. Let’s start by making some modifications to the scene class to preload an enemy sprite graphic. The PNG file represents how the alien will be drawn to the screen. We associate the name ‘enemy1’ with our PNG file.

class Scene1 extends Phaser.Scene {

    preload() {
        this.load.image('ship', 'assets/SpaceShooterRedux/PNG/playerShip1_orange.png');
        this.load.image('laser', 'assets/SpaceShooterRedux/PNG/Lasers/laserBlue01.png');
        this.load.image('enemy1', 'assets/SpaceShooterRedux/PNG/Enemies/enemyBlack3.png');
    }

    ...

In the Phaser game framework, we represent moving game entities as sprites. To define a sprite, we build out an enemy class. When we put a sprite into our scene (as the class is constructed), a special function called the constructor is invoked. We’ve designed the constructor so that we can set the enemy’s location at an (x, y) coordinate and connect it to the scene.

In the constructor, we accomplish the following work. We set the texture of the sprite to ‘enemy1’ and set its position. Next, we connect the sprite to the scene’s physics engine. We’ll use the physics engine to detect when the enemy gets hit by lasers. We also initialize the deltaX factor to 3. It’s not super exciting, but the aliens will shiver from side to side randomly. This, however, is good enough for a simple lesson. After you complete this tutorial, I encourage you to go crazy and make the aliens move any way you want!

class Enemy1 extends Phaser.GameObjects.Sprite {

    constructor(scene, x, y) {
        super(scene, x, y);
        this.setTexture('enemy1');
        this.setPosition(x, y);
        scene.physics.world.enable(this);

        this.gameObject = this;
        this.deltaX = 3;
    }

    ...

Adding movement to aliens

So, we’re ready to start moving some aliens. Let’s do this! We’re going to write three simple methods on the Enemy1 class. Following the pattern of all Phaser sprites, the update method will be called every game tick. It’s your job to tell the sprite how to move. Keep in mind, we’re going to implement a simple random “side to side” behavior. In the update method, we start by picking a random integer between 0 and 3. If k is 2, we make the sprite move left using the “this.moveLeft()” function. If k is 3, we make it move right using “this.moveRight()”. Otherwise, the sprite stays put for that tick.

    update() {
        let k = Math.floor(Math.random() * 4); // random integer from 0 to 3

        if (k == 2) {
            this.moveLeft();
        }
        else if (k == 3) {
            this.moveRight();
        }
    }

    moveLeft() {
        if (this.x > 0) {
            this.x -= this.deltaX;
        }
    }

    moveRight() {
        if (this.x < SCREEN_WIDTH) {
            this.x += this.deltaX;
        }
    }

Make lots of aliens

At this point, you want to see lots of moving aliens. Let’s add the code to the scene class to construct them. In the scene class, the “create” method constructs all objects, including our ship and the aliens. First, we create a special collection object called enemies. We’ll use this collection to track the enemies with the physics system. (this.enemies = this.physics.add.group()) On the next line, we create an Array to give us a simple way to track the enemies that need updating. In the loop, we create 21 aliens, place them at random locations, and add them to both collections. (enemies and enemies2)

class Scene1 extends Phaser.Scene {

    ...

    create() {
        this.cursors = this.input.keyboard.createCursorKeys();
        this.myShip = new Ship(this, 400, 500);
        this.add.existing(this.myShip);

        // ======= adding enemies ============
        this.enemies = this.physics.add.group();
        this.enemies2 = new Array();

        let k = 0;
        for (k = 0; k < 21; k++) {
            let x = Math.random() * 800;
            let y = Math.random() * 400;

            this.enemy = new Enemy1(this, x, y);
            this.add.existing(this.enemy);
            this.enemies.add(this.enemy);
            this.enemies2.push(this.enemy);
        }
    }

In order to invoke our update code for all enemies, we need to make one more edit to the scene class. In the “update” method, we add a loop that calls “update” on every enemy.

    update() {
        // there's more code related to the ship here 

        let j = 0;
        for (j = 0; j < this.enemies2.length; j++) {
            let enemy = this.enemies2[j];
            enemy.update();
        }
    }

At this point, we should see our aliens wiggling on the screen. And there’s much rejoicing!

Aliens go boom! Let’s do collision detection

In the laser class that we built in the last post, we need to make a few edits. Check out the code below. In the constructor of ShipLaser, we set the texture, position, and speed, and store the parent scene in “this.scene”. We connect the laser instance to the physics engine using “scene.physics.world.enable”. On the next line, we tell the game framework to check for collisions between this laser and the enemies. When a collision happens, we handle it using the “handleHit” function.

class ShipLaser extends Phaser.GameObjects.Sprite {

    constructor(scene, x, y) {
        super(scene, x, y);
        this.setTexture('laser');
        this.setPosition(x, y);
        this.speed = 10;
        this.scene = scene;

        // check out new code below ...
        scene.physics.world.enable(this);
        scene.physics.add.collider(this, scene.enemies, this.handleHit, null, this);
    }

In the handleHit function, you’ll notice that the laserSprite and enemySprite have been passed as parameters to the method. Phaser hands you these references so that you can define behaviors for both sprites. In this case, we’re simply going to destroy both objects.

    handleHit(laserSprite, enemySprite) {
        enemySprite.destroy(true);
        laserSprite.destroy(true);
    }

Hope this has been helpful. Please let me know if you have any questions.

Space shooter graphic

Recording Music and Audio with the Kids using Audacity

As a young person, my mom and dad invested a great deal in my growth as a musician. Looking back, I’m thankful that I’ve been able to use my gift of music to foster various ministries in our church. My wife and I love making music together by singing and playing the guitar. It’s honestly one of my favorite ways to re-charge and relax.

I wanted to give a shout out to a free tool that I have enjoyed using for basic music recording and talks. Audacity, a free and open source music recording software, has the ability to do a multi-track recording and has lots of basic effects. Audacity runs on Linux, Mac, and Windows. In contrast with other audio recording tools, I appreciate the simplicity of the user experience.

As a Dad, I’m excited to share the gift of music with my kids. My little girl has become very interested in singing lately. To help motivate her, I have started recording some of our jam sessions with Audacity. She loves showing off our work to mom. When my wife and I record music, I do use some professional mic equipment. For the recording sessions that I’m doing with my daughter, the laptop mic works just fine.

If you’re interested in starting a podcast, you might consider starting with Audacity. You can always advance to a more complex tool later. I found a comprehensive post on starting podcasts here. I do like their recommendation for purchasing a higher quality mic. In my experience, I’ve never had any issues with Audacity with advanced recording gear.

Here are some of the key features that I appreciate in Audacity:

  1. Multi-track recording: Let’s say you want to record several singers or instrumentalists individually. Audacity enables you to layer an individual track for each recording session, so you can edit, mute, solo, and apply effects on a per-track basis.
  2. Metronome: For some music recording situations, it’s helpful to have a metronome to help you align your tracks across sessions. You can add a metronome track by clicking “Generate > Rhythm track.” Audacity will enable you to set the tempo and generate a click track.
  3. Export to major audio formats: Out of the box, you can export your work to the most popular audio formats, like WAV, Ogg, and MP3. It’s pretty easy to share your work on services like SoundCloud.
  4. Effects: Audacity has many helpful effects for the entry-level sound engineer. You can amplify sound, apply compression, and apply reverb. When I’m playing with the kids in a silly manner, we sometimes enjoy becoming chipmunks by increasing the speed of tracks or adding lots of echoes.
  5. Editing audio: Audacity has a basic set of tools for editing audio. Once you’ve installed Audacity, you might check out David Taylor’s complete guide to Audacity. He provides a detailed introduction to the tool and many advanced features.

In researching this post, I found a pretty cool Edutopia article talking about the benefits of audio recording for writing. I like the idea of using an audio recording as a brainstorming tool. I also like the idea of reflecting on work by recording it and playing it back. I might try this idea as I’m teaching the kids piano.

Related Posts

Music Maker: Using NodeJS to Create Songs

Music maker screen shot

In my graduate school career, I had the opportunity with our evolutionary complexity lab to study creating music using neural networks and interactive genetic algorithms. It’s fun to study these two topics together since I enjoy crafting code and music. If you enjoy this too, you might enjoy Music Maker, a library I created to explore generating music with code. Sonic Pi by Sam Aaron, a popular tool for teaching music theory and Ruby code, inspired me to build this tool. My team from InspiredToEducate.NET enjoyed teaching a coding workshop on music using Sonic Pi. We, however, found it challenging to install Sonic Pi across a lab of computers. The workshop targets 5th-grade to 8th-grade students who have good typing skills. It would be cool if something like Sonic Pi supported features like Blockly coding too.

In terms of musical motivations, I wanted to provide features like the following:

  • Like Sonic-Pi, the tool should make it easy to generate chords and scales.
  • I want it to feel simple like Sonic-Pi. I, however, don’t think I’ve achieved this yet.
  • I wanted the tool to have a concept of players who can generate music over a chord progression. I believe it would be cool to grow an ecosystem of players for various time signatures and musical types.
  • I wanted to support the MIDI file format for output, making it possible to blend music from this tool into sequencers broadly available on the market. This also enables us to print out sheet music using the MIDI files.
  • Building drum patterns can be tedious, so I wanted a simple way to express rhythm.
  • We desired a browser-based interface that a teacher could install on a Raspberry Pi or some other computer. This was a key idea from one of my teachers. I’m focusing on building a tool that works on a local area network. (not the broad internet)
  • From a coding perspective, we needed to build a tool that could interface with Blockly coding someday. JavaScript became a logical choice. I’ve also wanted to explore a project that used TypeScript, NodeJS, and Express. I especially enjoyed using TypeScript for enums, classes, abstract classes, etc.

Here’s a sample MIDI file for your enjoyment:  jazz midi test

I do want to give a shout out to David Ingram of Google for putting together jsmidgen. David’s library handled all the low-level concerns of generating MIDI files, adding tracks, and adding notes. Please keep in mind that MIDI is a music protocol and file format built around the idea of turning notes on and off, like switches, over time. Make sure to check out his work. It’s a great NodeJS library.
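To make that “switches” idea concrete, here is a tiny, language-agnostic sketch (written in Python for brevity; the `note_events` helper and tick values are made up for illustration) of the two raw events behind a single played note. jsmidgen emits bytes like these for you when you call track.addNote.

```python
NOTE_ON, NOTE_OFF = 0x90, 0x80  # MIDI status bytes for channel 1

def note_events(note, velocity=64, duration_ticks=128):
    # A played note is just two events: switch it on, then switch it off later.
    return [
        (0,              NOTE_ON,  note, velocity),  # fire immediately
        (duration_ticks, NOTE_OFF, note, 0),         # release after the delay
    ]

# Middle C (MIDI note number 60) held for 128 ticks.
for event in note_events(60):
    print(event)
```

Everything else in a MIDI file (tempo, tracks, headers) is bookkeeping around streams of events like these.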

Here’s a quick tour of the API. It’s a work in progress.

Where do I get the code?

https://github.com/michaelprosario/music_maker

  • Run app.js to start the browser application. Once it’s running, you should find it at http://localhost:3000.
  • Make sure to check out the demo TypeScript files and music_maker.ts for sample code.

JSMIDGen reference

To learn more about jsmidgen, please visit https://github.com/dingram/jsmidgen

Hello world

var fs = require('fs');
var Midi = require('jsmidgen');
var Util = require('jsmidgen').Util;
import mm = require('./MusicMaker')

var beat=25;
var file = new Midi.File();

// Build a track
var track = new Midi.Track();
track.setTempo(80);
file.addTrack(track);

// Play a scale
var scale = mm.MakeScale("c4", mm.ScaleType.MajorPentatonic,2)

for(var i=0; i<scale.length; i++){
    track.addNote(0,scale[i],beat*2);
}

// Write a MIDI file
fs.writeFileSync('test.mid', file.toBytes(), 'binary');

Creating a new file and track

var file = new Midi.File();
var track = new Midi.Track();
track.setTempo(80);
file.addTrack(track);

// Play cool music here ...

Play three notes

track.addNote(0, mm.GetNoteNumber("c4"), beat);
track.addNote(0, mm.GetNoteNumber("d4"), beat);
track.addNote(0, mm.GetNoteNumber("e4"), beat);

Saving file to MIDI

fs.writeFileSync('test.mid', file.toBytes(), 'binary');

Playing a scale

var scale = mm.MakeScale("c4", mm.ScaleType.MajorPentatonic,2)

for(var i=0; i<scale.length; i++){
    track.addNote(0,scale[i],beat*2);
}

Playing drum patterns

var DrumNotes = mm.DrumNotes;
var addRhythmPattern = mm.AddRhythmPattern;
addRhythmPattern(track, "x-x-|x-x-|xxx-|x-xx",DrumNotes.ClosedHighHat);

Setup chord progression

var chordList = new Array();
chordList.push(new mm.ChordChange(mm.MakeChord("e4", mm.ChordType.Minor),4));
chordList.push(new mm.ChordChange(mm.MakeChord("c4", mm.ChordType.Major),4));
chordList.push(new mm.ChordChange(mm.MakeChord("d4", mm.ChordType.Major),4));
chordList.push(new mm.ChordChange(mm.MakeChord("c4", mm.ChordType.Major),4));

Play random notes from chord progression

var p = new mm.RandomPlayer();
p.PlayFromChordChanges(track, chordList, 0);

Play root of chord every measure

var p = new mm.SimplePlayer();
p.PlayFromChordChanges(track, chordList, 0);

Tour of chord players

var chordList = new Array();

// setup chord progression
chordList.push(new mm.ChordChange(mm.MakeChord("e4", mm.ChordType.Minor),4));
chordList.push(new mm.ChordChange(mm.MakeChord("c4", mm.ChordType.Major),4));
chordList.push(new mm.ChordChange(mm.MakeChord("d4", mm.ChordType.Major),4));
chordList.push(new mm.ChordChange(mm.MakeChord("c4", mm.ChordType.Major),4));

var chordPlayer = new mm.SimplePlayer();
chordPlayer.PlayFromChordChanges(track, chordList, 0);

chordPlayer = new mm.Arpeggio1();
chordPlayer.PlayFromChordChanges(track, chordList, 0);

chordPlayer = new mm.RandomPlayer();
chordPlayer.PlayFromChordChanges(track, chordList, 0);

chordPlayer = new mm.BassPLayer1();
chordPlayer.PlayFromChordChanges(track, chordList, 0);

chordPlayer = new mm.BassPLayer2();
chordPlayer.PlayFromChordChanges(track, chordList, 0);

chordPlayer = new mm.BassPLayer3();
chordPlayer.PlayFromChordChanges(track, chordList, 0);

chordPlayer = new mm.OffBeatPlayer();
chordPlayer.PlayFromChordChanges(track, chordList, 0);

 

If you write music using Music Maker, got ideas for features, or you’d like to contribute to the project, drop me a line in the comments or contact me through GitHub!  Hope you have a great one!

 

“Growing Your Developer Career using Open Source” via @JohnBaluka

We are open

Whether you’re just starting in your career or you’ve been working in the industry for years, you can benefit from the culture and practice of open source. I want to thank John Baluka for sharing his reflections and personal journey on this topic. I really appreciate John’s fresh business perspective on using open source to advance your learning and business. I had the opportunity to hear him share his talk on this topic during an ONETUG meeting this past week. If you’re in the Orlando area, make sure to check out ONETUG. They’re a great community of programming professionals.

Some programming communities have stronger cultures of sharing and open source than others. As web application developers, we naturally love open source software. Programmers who leverage NodeJS and JavaScript operate in a very open way because the world wide web itself operates in that manner. I’ve been working as a C# developer for over 20 years, and I’m very excited that our .NET community of developers has learned lessons from other languages and become open and collaborative. I still think it’s crazy that Microsoft has become the number one contributor to open source software. Stuff that used to be secret sauce has become open. On top of that, Microsoft has now bought GitHub.com. I look forward to seeing Microsoft and GitHub use their influence to increase the impact of open culture.

I believe that John hit on five thoughtful benefits of getting to know open source solutions. In John’s view, you need to be strategic about your investment of time.

1. Personal learning and growth: In John’s journey, he wanted to find an example of a large software architecture written in .NET and ASP.NET MVC. He selected NopCommerce, a cool e-commerce platform for .NET developers. John organized lessons and meta-patterns from dissecting this project into a talk. Some of the topics included dependency injection, language localization, data validation, plug-in architecture, and agile design. John offered us a challenge to select and study an open source project as a tool to advance your career in architecture or software leadership. On InspiredToEducate.NET, we have talked about this principle in the context of the maker movement. Everyone can learn something from reading code, exploring a 3D model, dissecting an electronics schematic, music, art, etc. What’s an open source project that fits into your space of passion?

2. Open source software enhances your public profile of work: When you hire an interior designer, how would you make your decision? You probably would review pictures of previous work to see if the designer fits with your tastes and requirements. For the average job interview in software engineering, it’s typically hard to show code from your previous gig. (i.e. corporate secrets, policies) Most companies don’t do their work in open source. By getting involved and contributing to an open source project, you can enhance your public profile of work. How does your GitHub reflect your strengths and skills?

3. Speed to solution: It’s important to remember that software developers aren’t paid to write code. We provide value and solve business problems. Open source software enables our teams to reduce time to market. Phil Haack, creator of ASP.NET MVC and engineer at GitHub, shared a reflection that businesses should always focus on their unique value proposition. (i.e., what makes your company different from other options) Open source provides an opportunity for companies to partner or collaborate on elements outside of your unique value proposition. Why write a big workflow system or content system when you can integrate one?

4. Open source is social: To advance your career, it’s important to expand your network and relationships. Growing authentic relationships becomes critical in growing your business. By collaborating on open source, you have an opportunity to learn from others. You have the opportunity to invest and support peers around you. I personally get excited about supporting the growth of others.

5. Business models around open source software: I really appreciate John’s reflections on this aspect. I admire his pragmatic approach to selecting NopCommerce. On one level, the open source project followed good and clean patterns. In his view, the project isn’t perfect, but you can learn something from it. By sharing his reflections on the software design during user group meetups and conferences, he started getting consulting requests to support NopCommerce integrations. He challenged us to strategically select an open source project for learning with an eye toward job growth. In the NopCommerce space, you can earn money by building store themes, building plugins, and providing support or integrations. Here are a few more blog posts that elaborate on this idea:

https://opensource.com/article/17/12/open-source-business-models
https://handsontable.com/blog/articles/5-successful-business-models-for-web-based-open-source-projects

What open source projects connect to your strengths, passions, and your career growth strategy? This was probably my favorite concept from John’s talk.

Again, I want to thank ONETUG and John Baluka for making this talk possible. I also appreciate John taking time after the meetup to hang out. I appreciate his accessibility.

Make sure to check out John’s talk and his resources.

Related Blog Posts

 

 

6 Resources To Build Cool Minecraft Mods with Python

Looking for a fun way to explore learning to code with your students or children? Consider writing Minecraft mods using Python. In our house, we continue to enjoy building (and destroying) together as a family in shared Minecraft worlds. I appreciate that Minecraft helps the kids exercise their thinking about working in 3D. Python’s concise expression, fast feedback, and quick iteration will keep students engaged.

 

As a parent, I have been searching for ways to make learning math more attractive for one of my kids. In this particular case, he loves to read and often finds ways to avoid doing tasks related to math. I’m so thankful that he has developed a joy in reading. I don’t think I had that motivation at his age. During a trip to a bookstore, he expressed interest in the book “Learn to Program with Minecraft” by Craig Richardson. As an experiment, we picked up the book to gauge his engagement level. In one week, he got to chapter 4 and started requesting that we practice coding Minecraft together after school. I felt something like this.

Seymour Papert, a key influence in the learning theory of constructionism, aspired to create a math world where children would play with math as a learning tool. I believe that he would be proud of the various open source projects that connect Minecraft to computational thinking.

To help you get started with coding Minecraft mods with Python, I wanted to share a few tools to help you get started.

1. Raspberry Pi: The Raspberry Pi is a great $40 computer built to engage students in playing with physical computing and computer science. If you run the Raspbian operating system on your Raspberry Pi, you already have a copy of Minecraft installed, along with the related Python tools.

2. Setup for Windows and Mac: If you run Minecraft (Java Edition) on Windows or Mac OS, you will find the following Instructables tutorial helpful. It walks you through the process of setting up your Minecraft server, setting up the Python API, and configuring your Minecraft environment.

http://www.instructables.com/id/Python-coding-for-Minecraft/

3. Getting Started with Minecraft Pi: This resource from the Raspberry Pi foundation provides a concise set of steps to get started. Make sure to check out the link on playing with TNT. (The kids enjoy that one!)

https://projects.raspberrypi.org/en/projects/getting-started-with-minecraft-pi/
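To give a flavor of what these lessons build toward, here is a minimal sketch using the mcpi library that ships with Minecraft Pi. The `tnt_cube_coords` helper and the coordinate offsets are my own illustration; the sketch assumes the mcpi library and a running Minecraft Pi world, so the game calls are kept inside a function.

```python
def tnt_cube_coords(x, y, z, size=3):
    # Hypothetical helper: the (x, y, z) positions for a size**3 cube of blocks.
    return [(x + dx, y + dy, z + dz)
            for dx in range(size)
            for dy in range(size)
            for dz in range(size)]

def drop_tnt():
    # Requires the mcpi library and a running Minecraft Pi world,
    # so the imports live inside the function.
    from mcpi.minecraft import Minecraft
    from mcpi import block

    mc = Minecraft.create()          # connect to the local game
    mc.postToChat("Incoming TNT!")
    pos = mc.player.getTilePos()     # the player's current block position
    for (x, y, z) in tnt_cube_coords(pos.x + 2, pos.y, pos.z):
        mc.setBlock(x, y, z, block.TNT.id, 1)  # data value 1 = TNT you can prime

# drop_tnt()  # uncomment inside a running Minecraft Pi session
```

A few lines like these are usually all it takes to hook a young programmer.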

4. MagPi Magazine issue on Minecraft coding: I’m a big supporter of the MagPi Magazine. I often give this magazine as a gift to my geek friends. They recently published an issue on Minecraft coding that you’d enjoy.

https://www.raspberrypi.org/magpi-issues/Essentials_Minecraft_v1.pdf

5. Minecraft Python API cheat sheet: For experienced programmers who need a quick reference guide to the Minecraft Python API, I found the following link helpful.

http://www.stuffaboutcode.com/p/minecraft-api-reference.html?m=1

6. www.codecademy.com: This interactive tutorial provides a fun way to get started with Python programming and many other languages. People learn best when they see a new idea and immediately apply it. Codecademy was designed with this learning pattern in mind: you are coached to apply every new programming concept immediately in an online code editor.

Related Blog Posts

 

Detecting Motion using Python, SimpleCV and a Raspberry Pi

Simple CV

My wife and kids enjoy bird watching. In our dining room, we attached a bird feeder to a window. My wife asked if I could hack together a way to snap pictures of birds that visit the feeder. After some searching, I realized that SimpleCV makes it easy to create a motion detection system with Python. SimpleCV has become one of my favorite open source computer vision tools that you program using Python. In general, computer vision is the branch of computer science that deals with understanding images and video. In this post, I’ll outline the major ideas from this script by the folks from SimpleCV. I made a few edits to the script to save video frames with motion to disk. To learn more about getting started with Python programming, check out this blog post.

To a computer, an image exists as a grid of numbers, where each number represents a color. A pixel is a cell in this grid at a particular (x, y) position. SimpleCV enables you to capture an image from your web camera using the following code.


from SimpleCV import *
cam = Camera()
current = cam.getImage()

Let’s say we capture two images taken within a 1/2 second of each other.


previous = cam.getImage() #grab a frame
time.sleep(0.5) #wait for half a second
current = cam.getImage() #grab another frame
diff = current - previous

SimpleCV defines an image subtraction operation so that you can find the differences between two images. If the current and previous images are exactly the same, SimpleCV will compute a black image. (i.e. a grid of zeros) If current and previous images have substantial differences, some of the cells in the diff image will have positive values.

At this point, we compute a ‘mean’ factor using all the pixel values from the diff image. If the mean value is higher than a particular threshold, we know that a motion event occurred. We capture an image and store the image to a file.
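The detection logic itself can be sketched with plain NumPy arrays standing in for two captured frames. The 4x4 frames and the threshold below are made-up illustration values; SimpleCV’s image math behaves similarly under the hood.

```python
import numpy as np

def motion_detected(previous, current, threshold=5.0):
    # Mean absolute pixel difference between the two frames.
    diff = np.abs(current.astype(int) - previous.astype(int))
    return bool(diff.mean() >= threshold)

# Two identical 4x4 grayscale frames: the diff is all zeros, so no motion.
frame_a = np.zeros((4, 4), dtype=np.uint8)
print(motion_detected(frame_a, frame_a))   # False

# A frame where a bright "bird" appears in one corner.
frame_b = frame_a.copy()
frame_b[0:2, 0:2] = 255                    # 4 of 16 pixels jump by 255
print(motion_detected(frame_a, frame_b))   # mean diff = 255 * 4 / 16 = 63.75 -> True
```

Tuning the threshold is the main knob: too low and camera noise triggers captures, too high and a small bird slips by.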

You can review the complete code solution below.

The following web page from SimpleCV outlines other applications of image math.

http://tutorial.simplecv.org/en/latest/examples/image-math.html?highlight=motion

I think the image motion blur and green screen tutorials look fun too.

To install SimpleCV on a Raspberry Pi, check out the following link:
http://simplecv.readthedocs.io/en/latest/HOWTO-Install%20on%20RaspberryPi.html


from SimpleCV import *
import time

cam = Camera()
threshold = 5.0  # if the mean pixel difference exceeds this amount, record motion
i = 0
disp = Display((1024, 768))

while True:
    previous = cam.getImage()  # grab a frame
    time.sleep(0.5)            # wait for half a second
    current = cam.getImage()   # grab another frame
    diff = current - previous
    matrix = diff.getNumpy()
    mean = matrix.mean()

    current.save(disp)

    if mean >= threshold:
        print "Motion Detected " + str(i)  # SimpleCV runs on Python 2

        # Capture an image and save it as a numbered JPEG.
        img = cam.getImage()
        img.save('%.06d.jpg' % i)

        # Increment the filename counter.
        i += 1

Interested in learning more about SimpleCV? Check out the following PyCon conference video

Abstract: Katherine Scott: This talk is a brief summary of a Computer Vision tutorial we proposed for PyCon. In this talk, we will discuss what computer vision is, why it’s useful, what tools exist in the Python ecosystem, and how to apply it to your project.

5 Fun BBC Microbit Project Lessons

As I have reflected on various physical computing activities we tried with our kids, I started reviewing a novel microcontroller from our friends at the BBC, the micro:bit.   In addition to the BBC bringing us awesome stories like Dr. Who, this organization has invested their resources to help students connect to creative computing tools for young makers.   The BBC micro:bit continues this cool tradition by offering inexpensive microcontrollers to empower students to build robots, explore wearable computing, and invent new stuff.  The BBC micro:bit device has an amazing set of features: Bluetooth or radio communication, a compass sensor, shake sensor, a couple of push buttons, a grid of LED lights, compact battery pack and a good number of inputs and outputs.   The input/outputs enable the student to drive servos, drive speakers or connect to other electronics.  I love this platform since novice makers can program the microcontrollers with block programming.  Advanced students will enjoy the ability to program the microcontrollers with languages like JavaScript and Python.  That’s a lot of capability for a low-cost microcontroller under $30.  I believe the BBC micro:bit can be a fine alternative to an Arduino for beginners. 

BBC micro:bit Robot

The micro:bit community has done a great job of putting together helpful tutorials and lessons for a wide range of students.  

To help jump-start your imagination for lessons and projects that you can explore with the BBC micro:bit, check out some of the videos below.

Compass Challenge by MrAColley

BBC microbit Python Circuit and Music Project by “Teacher of Computing”

Micro:bit automatic watering system demo by ProtoPICVideos

Making a room alarm with your micro:bit by MicroMonsters

micro:bit radio-controlled buggy project by A79BEC
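For a project like the compass challenge above, most of the logic is simply mapping a compass heading (0–360 degrees) to a direction name to show on the LED grid. The helper below is plain Python that you could paste into a micro:bit MicroPython script; on the device you would feed it the reading from `compass.heading()` and show the result with `display.scroll()` (both from the micro:bit's `microbit` module). The function name and the eight-point scheme are just one way to sketch it.

```python
def heading_to_direction(degrees):
    """Map a compass heading in degrees (0-360) to one of
    eight compass-point names, e.g. 45 -> 'NE'."""
    points = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
    # Each point owns a 45-degree slice centered on its bearing,
    # so shift by half a slice (22.5) before dividing.
    index = int((degrees % 360 + 22.5) // 45) % 8
    return points[index]

print(heading_to_direction(0))    # prints: N
print(heading_to_direction(95))   # prints: E
print(heading_to_direction(200))  # prints: S
```

On the device, the main loop would be roughly `display.scroll(heading_to_direction(compass.heading()))` after calling `compass.calibrate()` once at startup.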

Related posts on physical computing

Connecting Community Service to Makerspaces and Developer Communities

Team Open Barter

In November 2017, I had the honor of speaking at DevFest Florida, a community-organized developer conference focused on Google technology. I had an amazing time, and you can check out my reflections on the experience at GDGCentralFlorida.org. Readers of InspiredToEducate.NET know that we're passionate about helping students love learning through making, tinkering, and engineering. One talk in particular impacted me: it explored the intersection of community service and maker education. I do believe in Daniel Pink's argument that we're most motivated in situations where we have autonomy, are growing in mastery, and are acting with purpose. The projects I'll discuss in this post connect strongly to the mastery and purpose motivations. The talk encouraged me to reflect on why I enjoy helping people learn to code and why I value the culture of a makerspace.

Etienne Caron-Petit-Pas shared an amazing story of using mixed reality and maker technologies to create a positive social impact in his community. I appreciated that the OSMOS Academy, which he helps organize, focuses on building things that enrich people's lives; it's not just about the maker tech. For example, their current project builds playful VR experiences to support and distract kids going through medical procedures in a hospital. Some of their other projects feel like citizen science efforts. The talk touched on technology ranging from Google Daydream and augmented reality to Android Things and more.

In general, we've explored the idea that maker education connects students to the experience of project-based learning. Under this paradigm, students learn through the construction of projects or physical things; learning is no longer centered on the teacher as the sole source of knowledge. Maker education always asks students to personalize the learning experience with one question: what do you want to make? All other lessons connect to the project direction set by the student.

Along a similar theme, I recently encountered a cool podcast interviewing Quincy Larson, the founder of FreeCodeCamp.com. Quincy worked as a teacher in a traditional K-12 school. Along the way, he became interested in giving his fellow teachers more time by automating administrative computer tasks and creating systems for automatic grading. Through this experience, he became interested in programming professionally. After connecting with local makerspaces, hackathons, and meetups, and putting in thousands of hours of study through MOOCs, he returned to his "teacher hat" and realized that many others might want to go on this journey too. He organized FreeCodeCamp.com to help other "campers" leverage the resources and coaching he had gained. I'm very impressed with the scale of the curriculum, the community, and the effort to create local meetups in cities near you. While it's easy to find YouTube videos or MOOC content to learn from, their teaching team acknowledges that learning as a local tribe in your coffee shop or makerspace really helps drive the learning forward. It's very easy to get demotivated when you don't have mentors or fellow students to go on the journey with you.

I do want to give a shout-out to "The Changelog" podcast that shared this conversation. I haven't been listening to them long, but I enjoy their content.
https://changelog.com/podcast/195

FreeCodeCamp.com connects with the idea of community service learning by pairing its students with real non-profits that have real IT needs. It's a neat "win-win": the non-profit gets a cost-effective solution, and the students get a great learning experience addressing a local need while growing their web development skills.

On a personal level, I have enjoyed seeing students (young and old) become engaged with their path of learning through hackathons, makerspaces, and developer communities. Why does community service learning matter? It feels like a unique flavor of project-based learning, since grassroots, connected learners work together to learn while making a difference in their community. The world needs more of this kind of innovation in education and community service.

Related blog posts

10 AFrame.IO Resources For Your WebVR Project

AFrame Logo

I'm a big fan of the work of the AFrame.IO community. Thank you to Mozilla, Diego Marcos, Kevin Ngo, and Don McCurdy for their influence and effort in building a fun and productive platform for WebVR experiences. For some of my amigos from DevFestFlorida 2017, I've collected a few GitHub repositories and resources to support you in building A-Frame experiences.

Thanks to the efforts of many GDG leaders and Traversoft, you can check out my talk at DevFestFL in the following video. I had a great time connecting with other local web developers and sharing the WebVR love. I hope you enjoy the talk and find the following links helpful.

Talk Abstract: In the next few years, augmented reality and virtual reality will continue to provide innovations in gaming, education, and training. Other applications might include helping you tour your next vacation resort or explore a future architecture design. Thanks to open web standards like WebVR, web developers can leverage their existing skills in JavaScript and HTML to create delightful VR experiences. During this session, we will explore A-Frame.io, an open source project supported by Mozilla enabling you to craft VR experiences using JavaScript and a growing ecosystem of web components.

https://github.com/ngokevin/kframe
Kevin’s collection of A-Frame components and scenes.

https://webvr.donmccurdy.com/
Awesome WebVR from Don McCurdy

https://github.com/archilogic-com/3dio-js
JavaScript toolkit for interior apps https://3d.io

https://github.com/feiss/aframe-environment-component
Infinite background environments for your A-Frame VR scene in just one file.

https://github.com/aframevr/aframe-school
Interactive workshop and lessons for learning A-Frame and WebVR.

https://github.com/scenevr/htmltexture-component
Aframe component for using html as a texture, powered by html2canvas

https://github.com/nylki/aframe-lsystem-component
L-System/LSystem component for A-Frame to draw 3D turtle graphics. Using Lindenmayer as backend.

https://aframe.io/aframe-registry/
Official registry of cool AFrame stuff

https://github.com/omgitsraven/aframe-room-component
A set of A-Frame components for quickly creating rooms connected by doors.

https://github.com/donmccurdy/aframe-physics-system
Components for A-Frame physics integration, built on CANNON.js.

https://github.com/michaelprosario/aframe
I’ve collected a small collection of demo apps to explore some of the core ideas of AFrame.


If you live in Central Florida or Orlando, consider checking out our local Google Developer Group chapter. We enjoy building a fun, creative community of developers, sharing ideas and code, and supporting each other in the craft of software. Learn more about our community here:

GDGCentralFlorida.org


Top Stories on InspiredToEducate.NET