

The Microsoft Kinect sensor is a peripheral device (designed for Xbox and Windows PCs) that functions much like a webcam. However, in addition to providing an RGB image, it also provides a depth map: for every pixel seen by the sensor, the Kinect measures its distance from the sensor. This makes a variety of computer vision problems, like background removal and blob detection, much easier and more fun!

The Kinect sensor itself only measures color and depth. However, once that information is on your computer, lots more can be done, like “skeleton” tracking (i.e. detecting a model of a person and tracking his or her movements). To do skeleton tracking you’ll need to use Thomas Lengling’s Windows-only Kinect v2 Processing library. However, if you’re on a Mac and all you want is raw data from the Kinect, you are in luck! This library uses the libfreenect and libfreenect2 open source drivers to access that data for Mac OS X (Windows support coming soon).

What hardware do I need?

First you need a “stand-alone” kinect. You do not need to buy an Xbox.

  • Standalone Kinect Sensor v1. I believe this one comes with the power supply, so you do not need the separate adapter listed next. However, if you have a kinect v1 that came with an Xbox, it will not include the Kinect Sensor Power Supply.
  • Standalone Kinect Sensor v2. You will also probably need the Kinect Adapter for Windows. Don’t be thrown off: although it says Windows, it will allow you to connect the sensor to your Mac via USB. Finally, you’ll also want to make sure your computer supports USB 3. Most modern machines do, but if you are unsure you can find out more here for Mac OS X.


Some additional notes about different models:

  • Kinect 1414: This is the original kinect and works with the library documented on this page in the Processing 3.0 beta series.
  • Kinect 1473: This looks identical to the 1414, but is an updated model. It should work with this library, but I don’t have one to test. Please let me know if it does or does not!
  • Kinect for Windows version 1: ???? Help? Does this one work?
  • Kinect for Windows version 2: This is the brand spanking new kinect with all the features found in the XBox One Kinect. Also works with this library!

SimpleOpenNI

You could also consider using the SimpleOpenNI library and read Greg Borenstein’s Making Things See book. OpenNI has features (skeleton tracking, gesture recognition, etc.) that are not available in this library. Unfortunately, OpenNI was recently purchased by Apple and, while I thought it had been shut down, there appear to be some efforts to revive it! It’s unclear what the future of OpenNI and SimpleOpenNI will be.

I’m ready to get started right now

The easiest way to install the library is with the Processing Contributions Manager: Sketch → Import Library → Add Library, then search for “Kinect”. A button will appear labeled “Install”. If you want to install it manually, download the most recent release and extract it into the libraries folder. Restart Processing, open up one of the examples in the examples folder, and you are good to go!

What is Processing?

Processing is an open source programming language and environment for people who want to create images, animations, and interactions. Initially developed to serve as a software sketchbook and to teach fundamentals of computer programming within a visual context, Processing also has evolved into a tool for generating finished professional work. Today, there are tens of thousands of students, artists, designers, researchers, and hobbyists who use Processing for learning, prototyping, and production.

What if I don’t want to use Processing?

If you are comfortable with C++ I suggest you consider using openFrameworks or Cinder with the Kinect. These environments have some additional features and you also may get a C++ speed advantage when processing the depth data, etc.:

  • More resources from: The OpenKinect Project

What code do I write?

First thing is to include the proper import statements at the top of your code:
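A minimal sketch of the import, assuming the package name org.openkinect.processing used by recent releases of the library:

import org.openkinect.processing.*;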

As well as a reference to a Kinect object, i.e.
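For example (the variable name kinect here is just a placeholder):

Kinect kinect;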

Then in setup() you can initialize that kinect object:
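A sketch of that initialization, assuming the constructor takes the parent sketch (this) as its argument; initDevice() is listed in the function reference below:

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.initDevice(); // start video, depth, and IR
}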

If you are using a Kinect v2, use a Kinect2 object instead.
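For example (again assuming a constructor that takes this):

Kinect2 kinect2;

void setup() {
  size(512, 424);
  kinect2 = new Kinect2(this);
  kinect2.initDevice(); // start video, depth, and IR
}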

Once you’ve done this you can begin to access data from the kinect sensor. Currently, the library makes data available to you in five ways:

  • PImage (RGB) from the kinect video camera.
  • PImage (grayscale) from the kinect IR camera.
  • PImage (grayscale) with each pixel’s brightness mapped to depth (brighter = closer).
  • PImage (RGB) with each pixel’s hue mapped to depth.
  • int[] array with raw depth data (11 bit numbers between 0 and 2048).

Let’s look at these one at a time. If you want to use the Kinect just like a regular old webcam, you can access the video image as a PImage!

You can simply ask for this image in draw(); however, you can also use videoEvent() to know when a new image is available.
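For example, a sketch that just shows the camera feed; getVideoImage() is in the function list below, while the videoEvent() parameter shown here is an assumption:

void draw() {
  // grab the current RGB frame and draw it
  PImage img = kinect.getVideoImage();
  image(img, 0, 0);
}

// optional: called when a new frame is available (assumed signature)
void videoEvent(Kinect k) {
  // a fresh image can be read here via k.getVideoImage()
}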

If you want the IR image:
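A sketch, using enableIR() from the function list below; on the v1 the frame still comes back through getVideoImage():

kinect.enableIR(true);                  // v1 only: switch the video stream to IR
PImage irImg = kinect.getVideoImage();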

With the kinect v1, you cannot get both the video image and the IR image. They are both passed back via getVideoImage(), so whichever one was most recently enabled is the one you will get. However, with the Kinect v2, they are both available as separate methods:
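For example, with the Kinect2 object from earlier:

PImage video = kinect2.getVideoImage(); // RGB camera
PImage ir    = kinect2.getIrImage();    // IR camera (v2 only)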

Now, if you want the depth image, you can request the grayscale image:
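For example:

PImage depthImg = kinect.getDepthImage(); // grayscale, brighter = closer
image(depthImg, 0, 0);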

As well as the raw depth data:
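For example:

int[] depth = kinect.getRawDepth(); // one value per pixel (0–2048 on the v1)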


For the kinect v1, the raw depth values range between 0 and 2048; for the kinect v2, the range is between 0 and 4500.

For the color depth image, use kinect.enableColorDepth(true). And just like with the video image, there’s a depth event you can access if necessary.
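A sketch of both; note that the exact depthEvent() signature below is my assumption, by analogy with videoEvent():

kinect.enableColorDepth(true); // hue-mapped depth image instead of grayscale

// assumed callback signature
void depthEvent(Kinect k) {
  // fresh depth data can be read here, e.g. k.getDepthImage() or k.getRawDepth()
}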

Unfortunately, because the RGB camera and the IR camera are not physically located in the same spot, there is a stereo vision problem: pixel XY in one image is not the same XY in an image from a camera an inch to the right. The Kinect v2 offers what’s called a “registered” image, which aligns all the depth values with the RGB camera ones. This can be accessed as follows:
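For example, with the Kinect2 object (getRegisteredImage() is in the function list below):

PImage registered = kinect2.getRegisteredImage(); // depth aligned to the RGB image
image(registered, 0, 0);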

Finally, for kinect v1 (but not v2), you can also adjust the camera angle with the setTilt() method.
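For example, a small sketch that nudges the tilt with the arrow keys; the 0–30 degree range comes from the function list below, and the key handling is just an illustration:

void keyPressed() {
  float angle = kinect.getTilt();
  if (keyCode == UP) {
    angle = constrain(angle + 1, 0, 30);
  } else if (keyCode == DOWN) {
    angle = constrain(angle - 1, 0, 30);
  }
  kinect.setTilt(angle);
}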

So, there you have it, here are all the useful functions you might need to use the Processing kinect library:

  • initDevice() — start everything (video, depth, IR)
  • activateDevice(int) — activate a specific device when multiple devices are connected
  • initVideo() — start video only
  • enableIR(boolean) — turn on or off the IR camera image (v1 only)
  • initDepth() — start depth only
  • enableColorDepth(boolean) — turn on or off the depth values as color image
  • enableMirror(boolean) — mirror the image and depth data (v1 only)
  • PImage getVideoImage() — grab the RGB (or IR for v1) video image
  • PImage getIrImage() — grab the IR image (v2 only)
  • PImage getDepthImage() — grab the depth map image
  • PImage getRegisteredImage() — grab the registered depth image (v2 only)
  • int[] getRawDepth() — grab the raw depth data
  • float getTilt() — get the current sensor angle (between 0 and 30 degrees) (v1 only)
  • setTilt(float) — adjust the sensor angle (between 0 and 30 degrees) (v1 only)

For everything else, you can also take a look at the javadoc reference.

Examples

There are four basic examples for both v1 and v2.

Display RGB, IR, and Depth Images

Code for v1: RGBDepthTest

Code for v2: RGBDepthTest2

This example uses all of the above listed functions to display the data from the kinect sensor.

Multiple devices

Both v1 and v2 have multiple kinect support.

Code for v1: MultiKinect

Code for v2: MultiKinect2

Point Cloud

Code for v1: PointCloud

Code for v2: PointCloud

Here, we’re doing something a bit fancier. Number one, we’re using the 3D capabilities of Processing to draw points in space. You’ll want to familiarize yourself with translate(), rotate(), pushMatrix(), popMatrix(). This tutorial is also a good place to start. In addition, the example uses a PVector to describe a point in 3D space. More here: PVector tutorial.

The real work of this example, however, doesn’t come from me at all. The raw depth values from the kinect are not directly proportional to physical depth. Rather, they scale with the inverse of the depth according to this formula:
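A sketch of the conversion, using the inverse-linear formula commonly attributed to Matthew Fisher (the constants here are from memory and, as noted below, really call for per-device calibration):

float rawDepthToMeters(int depthValue) {
  if (depthValue < 2047) {
    return (float) (1.0 / ((double) depthValue * -0.0030711016 + 3.3309495161));
  }
  return 0.0f; // out-of-range values are treated as "no reading"
}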


Rather than do this calculation all the time, we can precompute all of these values in a lookup table since there are only 2048 depth values.
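A sketch of that precomputation, using the conversion function above:

float[] depthLookUp = new float[2048];

void setup() {
  // ... kinect setup as before ...
  for (int i = 0; i < depthLookUp.length; i++) {
    depthLookUp[i] = rawDepthToMeters(i);
  }
}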

Thanks to Matthew Fisher for the above formula. (Note: for the results to be more accurate, you would need to calibrate your specific kinect device, but the formula is close enough for me so I’m sticking with it for now. More about calibration in a moment.)

Finally, we can draw some points based on the depth values in meters:
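A simplified sketch of the drawing loop. The actual example also projects each (x, y, depth) triple into real-world coordinates using calibration constants, which I’ve left out here; this version assumes the v1’s 640×480 depth image, spreads the points on a grid, and pushes each one back by its depth in meters (the scale factors are arbitrary). It needs a 3D renderer, e.g. size(800, 600, P3D):

void draw() {
  background(0);
  int[] depth = kinect.getRawDepth();
  int skip = 4; // don't bother drawing every single pixel

  translate(width/2, height/2, -50);
  stroke(255);

  for (int x = 0; x < 640; x += skip) {
    for (int y = 0; y < 480; y += skip) {
      int offset = x + y * 640;
      int raw = min(depth[offset], depthLookUp.length - 1); // guard against the 2048 "no reading" value
      float d = depthLookUp[raw]; // depth in meters

      pushMatrix();
      // arbitrary scaling, just to make the cloud visible
      translate((x - 320) * 2, (y - 240) * 2, -d * 100);
      point(0, 0);
      popMatrix();
    }
  }
}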

Average Point Tracking

The real magic of the kinect lies in its computer vision capabilities. With depth information, you can do all sorts of fun things like say: “the background is anything beyond 5 feet. Ignore it!” Without depth, background removal involves all sorts of painstaking pixel comparisons. As a quick demonstration of this idea, here is a very basic example that computes the average x-y location of any pixels in front of a given depth threshold.

Source for v1: AveragePointTracking

Source for v2: AveragePointTracking2

In this example, I declare two variables to add up all the appropriate x’s and y’s and one variable to keep track of how many there are.

Then, whenever we find a given point that complies with our threshold, I add the x and y to the sum:
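A sketch of that loop over the raw depth array; the threshold value and the 640×480 dimensions are assumptions for a v1 sensor (remember that smaller raw values mean closer):

float sumX = 0;
float sumY = 0;
float count = 0;
int threshold = 745; // raw values below this count as "close enough" (tune to taste)

int[] depth = kinect.getRawDepth();
for (int x = 0; x < 640; x++) {
  for (int y = 0; y < 480; y++) {
    int offset = x + y * 640;
    if (depth[offset] < threshold) {
      sumX += x;
      sumY += y;
      count++;
    }
  }
}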

When we’re done, we calculate the average and draw a point!
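For example, guarding against the case where nothing was close enough:

if (count > 0) {
  float avgX = sumX / count;
  float avgY = sumY / count;
  fill(255, 0, 0);
  ellipse(avgX, avgY, 16, 16);
}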

What’s missing?

  • Everything is being tracked via github issues.

FAQ

  1. Why are there shadows in the depth image (v1)? Kinect Shadow diagram
  2. What is the range of depth that the kinect can see? (v1) ~0.7–6 meters or 2.3–20 feet. Note that you will get black pixels (or a raw depth value of 2048) both for elements that are too far away and for those that are too close.

10.3: Enable the floating Exposé blob (comments)

How bizarre!
There's some other mysterious preference keys, such as:
[code]wvous-tl-corner
wvous-tr-corner
wvous-bl-corner
wvous-br-corner[/code]
No idea what these are, but the Dock set wvous-br-corner equal to 5 by itself. I have no idea what it means.
There's also:
[code]wvous-showcorners
wvous-floater-style
wvous-maindisplay
wvous-olddesktop
wvous-spring-delay
wvous-spring[/code]
I've tried setting wvous-showcorners, wvous-floater-style, wvous-spring-delay, and wvous-spring with no obvious effects. Can anybody else figure out what these are?

D'OH!
Note to self: READ the preview before submitting!

top left corner
bottom left corner
top right corner
bottom right corner
Looks like those keys store hot corner settings.

I figured that's what the abbreviations meant, but hot corners for what? Also, the key that the Dock stored by itself was the bottom-right corner set to 5? What does that mean?

I ran the command in Terminal, and also ran 'killall Dock' from Terminal. Since then, I logged out last night and in the morning I saw something strange!!!
There was a Finder window open, floating under the login list window. I noticed that the home folder was labeled Root, and I could run apps as root! When I logged in as a user I never received the desktop, and still had my Finder window. When I ran an app, it ran as that user. When going to log out (from the Apple menu) I would see either log out 'System Administrator' or log out '(the user's name)', depending on the application I was currently running. After logging out and then back in as another user, I received its desktop items, and everything looked back to normal.
I grabbed a screen shot, but do not know how to post it.
Very Interesting!!!!!

Go to
http://www.imageshack.com/
and upload it there. Post the URL it gives you here.

...Wow. I just looked at the pictures again and realized that you did something with iTunes I never knew was possible!
Apparently trying to shrink the minimized version will get rid of the status fields entirely!
Maybe there should be a hint about that...

haha! You and I both got a two-for-one hint!

You should get a copy of Kelby's Jaguar Killer Tips, or just look through it in the bookstore. The teeny iTunes window tip is there. In 'mini window mode', just grab the resize handle (lower right corner) and resize.

Ahhh! That screenshot! It's Rover from 'The Prisoner'! Run away!

(I can't find a decent picture of Rover to link to, but this page does a decent job of explaining the reference and conveying the imagery almost as well as the original :-)

---
--
DO NOT LEAVE IT IS NOT REAL

Oh wow, you're right!

Ok, I did some playing around and this is what I found.
This will make little gray semi-circles appear in the corners.
These refer to the Exposé corners: tl = Top Left, tr = Top Right, bl = Bottom Left, br = Bottom Right. Replace 'tl' with whichever corner you want, and 'x' with a number between 1 and 6. The number is the action that is performed by Exposé: 1 = Nothing (the semicircle will disappear), 2 = All Windows, 3 = Application Windows, 4 = Desktop, 5 = Start Screen Saver, 6 = Disable Screen Saver. All of these can be set from the Exposé preference pane.

This (setting wvous-olddesktop to false) will change the Desktop effect. Instead of moving all the windows to the edges of the screen, it puts them all in a small box that can be dragged around the screen. Anyone had any luck with the others?

---
Do it today because tomorrow it may be illegal.

When I set wvous-olddesktop to 'no' to get the hyperzoomy effect, as soon as I zoomed to desktop then back, all metal windows stopped responding to mouse clicks - including the Finder! So apparently that effect was disabled because of some really weird interaction with the window server or something.
It's a shame, too, because that effect is much cooler than the windows all flying to the edge of the screen, and it could probably be argued that it's better for usability too (since it preserves spatial memory and so on).
Also, when I enabled the semicircles, the entire UI froze up as soon as I tried any Expose action (though iTunes kept on playing).
It might make a difference that I'm just running on a G4/450 with a Rage 128. (I'm getting a Radeon soon, honest!)

Wow, I love the olddesktop effect!
BTW, metal windows work perfectly fine for me after using it.
Erm, nevermind. The Preview button just stopped working until I scrolled the scrollbar manually (scroll wheel didn't work). But it's working now after trying the exposé effect again. Weird.

Actually, I know the problem now. It leaves an invisible window (I assume) in the area where the windows shrink to that sucks up all clicks in that area. So wherever you place the tiny window, you can't click. Turning olddesktop back on fixes it, though.

I figured out wvous-spring and wvous-spring-delay.
By default, if you drag a file (or probably any drag, but only tested with files from the Desktop) to an Exposé'd window (i.e. start drag, start Exposé, drag to window) and hover over a window for 1 second, it will flash and select itself. Just like spring-loaded folders (Space works for selecting the window as well).
If you do <tt>defaults write com.apple.Dock wvous-spring -bool false</tt> and restart the Dock, this behaviour disappears. To restore, do <tt>-bool true</tt> instead (or do a <tt>defaults delete com.apple.Dock wvous-spring</tt>).
To change the delay before the spring, set wvous-spring-delay. The value is an integer in milliseconds (i.e. 1000 = 1 second). For example, to set it to 2 seconds, type <tt>defaults write com.apple.Dock wvous-spring-delay -int 2000</tt>.

D'OH! I did it again. The relevant code snippets were:
I've had bad luck with the small box feature: mouse clicks were not recognized at the screen position of the small box. Whenever I clicked in some app window around the area where the small box resided, nothing happened.
This was driving me crazy, because I had to shift the windows to get the mouse clicks working, and it took me weeks to find the reason why...
So I went back to the old-fashioned way again.

Sure, it's the easy way - but the fact is, The Blob, from Crabby Apple Software, allows a set-and-forget blob... of your own design! I'm always on a PowerBook, thus forever obsessing over screen real estate - but maybe you don't like overly-large glossy blue circles either.
Drop any graphic on The Blob's window. Now I have an unobtrusive, pale blue-green square about half an inch across in the corner below A-Dock's trash. That's plenty large as a mouse/trackpad target.
Yeah, you have to enable the blob in Cocktail, OnyX, the shell, whatever. Try the 'desktop square' option... very cool.

Okay, I'm a little lost, so please bear with me. I used OnyX to turn on the blob and now the windows don't go to the sides when I hit F10.
If I go to Terminal and type
'defaults write com.apple.dock wvous-olddesktop -bool true'
minus the quotes, will that fix it?

D'oh!
F11. The one that's supposed to show the desktop.

I tried this in Panther and not only could I not get it to work, I lost my second monitor.
So far I have not been able to get it (the monitor) to come back. If I reboot into 10.8, which I have on another drive, the monitor is still off, but if I re-power it, it comes back and works fine, until I start up in Panther again, when it disappears.
I have a G4 mirrored drive door that is original except for two extra internal drives.

10.8? What are you, part of some super-secret developer program from the future? :D

When you said 10.8, did you mean 10.2.8? Expose is only a feature in 10.3 and higher.
If you added information to your com.apple.dock and you are using Jaguar, I'm guessing that you are going to have to remove it somehow. And I'm afraid I don't know how to go about doing that.
Does this affect all users, or just your current user?
I hope this helps, if only a little.


robg... Looks like you've got a secret (software engineer) admirer:
http://www.versiontracker.com/dyn/moreinfo/macosx/21323
-Jesse

Does changing the images change its size, or is there another way?
I would find this feature really useful if I could make the blob a little smaller and toss it in a rather specific place on my screen (currently unoccupied), but it is too big right now.

Put the blob mostly offscreen in a corner and then you have a little blue semi-circle that, when clicked, triggers Exposé as noted in the hint. I put it off the bottom-left corner. Just a simple way to make the blob smaller. I like it. Of course you can set the corners to trigger Exposé just by moving to the corner, but I find that with such settings I frequently trigger Exposé accidentally. Gets annoying. The blob off the corner works much better for me. I wish I could set the default action to be show all windows instead of the current app's windows. Any way to do that?
---
--- What?

I thought this would be pretty slick, but alas, when I enabled the Blob, and then tried to move it, I couldn't get the mouse pointer to drop the blob anywhere. The Blob stayed glued to the pointer. The keyboard could still talk to the computer, but the mouse was worthless (except as a Blob Bellboy).
Now, here's the setup for what it's worth: an AGP 400 with a kensington pocket mouse.
I'll just stick to slamming the mouse into the lower left corner...

I pulled the images out and shrunk them by 50% - works much better for me. Like others mentioned, the original was just too big.

I have Panther, and I have enabled the blob; however, I can't get rid of it. I've tried restarting after setting the bool value as me, and as root, and killall Dock. Nothin'. If anyone has any advice, let me know.

Ok, I fixed my problem by going to /System/Library/CoreServices and then opening the Dock package, then deleting the floater images. I archived them in case I want them back... but if anyone else had this problem, that is a way to fix it.

I wish I had never seen this page. Right after typing that blue blob line into Terminal, my Kensington MouseWorks stopped working. I typed the 'false' line to get rid of the blob, but MouseWorks still isn't working. I uninstalled and reinstalled both the latest beta and the older final, to no avail. I think it did something to the Dock, because as the system boots up, the MouseWorks drivers ARE working (the mouse moves very smoothly), and it only cuts back to normal mouse movement when the Dock comes back up.
So my question is: what can you do to restore the Dock to the way it was before I typed anything into Terminal?

TransparentDock from
http://www.freerangemac.com
has a prefs selection to turn this puppy on and off

How come the Desktop effect leaves a ghost on the monitor? When I click or drag anything on my desktop in the same location the Desktop effect was at, it acts like the screen is locked into the exact same form as the Desktop effect...
---
www.artistz.tk

This is all great. I love the blob - my favorite way to activate Exposé. However, one question: how do I make arrange-all-open-windows the default action when I click on it, rather than the application windows of the current app? I would rather have all windows be the default and use the Option-click combo for the other. (Too bad right-clicking with a two-button mouse doesn't do this!)
---
iMac 17, OS 10.3


This works well if you set up an AppleScript or Platypus app to turn it on and off. More info on Platypus here: http://sveinbjorn.vefsyn.is/platypus .
To make a Platypus app, I used TextEdit to make two one-line scripts: one to turn the blob on, and one to turn it off. Put the following line in TextEdit:
defaults write com.apple.dock wvous-floater -bool true; killall Dock
(Change true to false for the script to turn it off.) From TextEdit's Format menu choose Make Plain Text. Save the script, then drag-and-drop it into Platypus and click 'Create'. Do the same for the script to turn it off, then put your Platypus apps in your Utilities folder, or wherever you want to keep them.
Likewise, in AppleScript, this script will run the command in a terminal (if you don't want to mess with Platypus):
tell application "Terminal"
activate
do script with command "defaults write com.apple.dock wvous-floater -bool true; killall Dock"
end tell

It still works beautifully with Tiger

my blob disappeared...

Running 10.4.10 on a new iMac; I got the blob to appear ONCE. When I tried to click and drag it to another part of the desktop, it zipped way off to the upper left of the screen, so that only a slice of it was still showing; when I clicked on that, it disappeared entirely, never to return (despite multiple attempts using defaults write com.apple.dock wvous-floater -bool true and false) -- any suggestions??? I would love to have it back...