Search Results

Search found 119 results on 5 pages for 'robotics'.

Page 2/5

  • Symbol lookup error in Player/Stage!

    - by Arkapravo
    Hi everyone! I was getting a symbol lookup error in Player/Stage, and I rectified it after hours of trial and error (see my blogspot). Even so, after about every 30 uses the error resurfaces and I have to rectify it again. As of now I am able to work fine; the rectification only takes about 30 seconds. However, I am not sure why this is happening. Can someone please explain what a SYMBOL LOOKUP ERROR is and why it happens? Many thanks! Arkapravo

    Read the article

  • Rotation Matrix calculates by column not by row

    - by pinnacler
    I have a class called forest and a property called fixedPositions that stores 100 points (x,y); they are stored 250x2 (rows x columns) in MATLAB. When I select fixedPositions, I can click scatter and it will plot the points. Now I want to rotate the plotted points, and I have a rotation matrix that will allow me to do that. The code below should work:

        theta = obj.heading * pi/180;
        apparent = [cos(theta) -sin(theta) ; sin(theta) cos(theta)] * obj.fixedPositions;

    But it won't. I get this error:

        ??? Error using ==> mtimes
        Inner matrix dimensions must agree.
        Error in ==> landmarks>landmarks.get.apparentPositions at 22
        apparent = [cos(theta) -sin(theta) ; sin(theta) cos(theta)] * obj.fixedPositions;

    When I alter forest.fixedPositions to store the variables 2x250 instead of 250x2, the above code will work, but it won't plot. I'm going to be plotting fixedPositions constantly in a simulation, so I'd prefer to leave it as is and make the rotation work instead. Any ideas? Also, fixedPositions is the position of the xy points as if you were looking straight ahead, i.e. heading = 0. heading is set to 45, meaning I want to rotate the points clockwise 45 degrees. Here is my code:

        classdef landmarks
            properties
                fixedPositions %# positions in a fixed coordinate system. [x, y]
                heading = 45;  %# direction in which the robot is facing
            end
            properties (Dependent)
                apparentPositions
            end
            methods
                function obj = landmarks(numberOfTrees)
                    %# randomly generate numberOfTrees x,y coordinates and set
                    %# the array or matrix (not sure which) to fixedPositions
                    obj.fixedPositions = 100 * rand([numberOfTrees,2]) .* sign(rand([numberOfTrees,2]) - 0.5);
                end
                function obj = set.apparentPositions(obj,~)
                    theta = obj.heading * pi/180;
                    [cos(theta) -sin(theta) ; sin(theta) cos(theta)] * obj.fixedPositions;
                end
                function apparent = get.apparentPositions(obj)
                    %# rotate obj.positions using obj.facing to generate the output
                    theta = obj.heading * pi/180;
                    apparent = [cos(theta) -sin(theta) ; sin(theta) cos(theta)] * obj.fixedPositions;
                end
            end
        end

    P.S. If you change one line to this:

        obj.fixedPositions = 100 * rand([2,numberOfTrees]) .* sign(rand([2,numberOfTrees]) - 0.5);

    everything will work fine... it just won't plot.
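    A note on the fix: a 2x2 rotation matrix multiplies 2xN column-vector points, so with points stored Nx2 you can right-multiply by the transposed matrix instead of transposing your data; in MATLAB that is apparent = obj.fixedPositions * [cos(theta) sin(theta); -sin(theta) cos(theta)], which keeps everything 250x2 for plotting. A minimal NumPy sketch of the same linear algebra (an illustration, not the poster's MATLAB):

        import numpy as np

        def rotate_points(points, heading_deg):
            """Rotate N x 2 points by heading_deg (use a negative angle for clockwise)."""
            theta = np.deg2rad(heading_deg)
            R = np.array([[np.cos(theta), -np.sin(theta)],
                          [np.sin(theta),  np.cos(theta)]])
            # (R @ p)^T == p @ R.T, so the points can stay N x 2 for plotting.
            return points @ R.T

        pts = 100 * np.random.uniform(-1, 1, size=(250, 2))
        rotated = rotate_points(pts, -45)  # still 250 x 2, ready for scatter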

    Read the article

  • Finding distance travelled by robot using Optical Flow

    - by user280454
    Hi, I'm working on a project right now in which we are developing an autonomous robot. I basically have to find out the distance travelled by the robot between any two points in time. I'm using OpenCV, and using its Optical Flow functions I'm able to find out the velocity/distance of each pixel between two different images. Using this information, I want to be able to find out the distance travelled by the robot in the interval between those two images. I thought of a way in which we could develop an input-output mapping between the distance travelled by the pixels and the distance travelled by the bot (using some tests); in this way, using neural networks, we would be able to find the relationship. However, the optical flow depends on the distance of the camera from the pixel, which would cause problems. Is there any way to solve this problem?
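    One way to make the camera-distance problem concrete: with a downward-facing camera at a known, fixed height, pixel flow maps to ground motion through a single calibrated constant, and the learned mapping becomes unnecessary. A rough OpenCV sketch under that assumption (METERS_PER_PIXEL is something you would calibrate, not a given):

        import cv2
        import numpy as np

        METERS_PER_PIXEL = 0.002  # assumed: calibrated for one fixed camera height

        def translation_between(prev_gray, next_gray):
            """Estimate ground translation between two frames from dense optical flow."""
            flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            # The median is more robust than the mean against moving objects.
            dx = float(np.median(flow[..., 0]))
            dy = float(np.median(flow[..., 1]))
            return dx * METERS_PER_PIXEL, dy * METERS_PER_PIXEL

    With a forward-facing camera the scale varies per pixel with scene depth, which is why monocular visual odometry recovers translation only up to scale unless something else (known camera height, stereo, an IMU, or wheel odometry) pins the scale down.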

    Read the article

  • Representing robot's elbow angle in 3-D

    - by Onkar Deshpande
    I am given the coordinates of two points in 3-D, viz. the shoulder point and the object point (which I am supposed to reach). I am also given the length of my shoulder-to-elbow arm and the length of my forearm. I am trying to solve for the unknown position (the position of the elbow joint). I am using the cosine rule to find the elbow angle. Here is my code:

        #include <stdio.h>
        #include <math.h>
        #include <stdlib.h>

        struct point {
            double x, y, z;
        };

        struct angles {
            double clock_wise;
            double counter_clock_wise;
        };

        double max(double a, double b)
        {
            return (a > b) ? a : b;
        }

        /*
         * Check if the combination can make a triangle by considering the fact
         * that the sum of any two sides of a triangle is greater than the
         * remaining side. The overlapping condition of links is handled
         * separately in main().
         */
        int valid_triangle(struct point p0, double l0, struct point p1, double l1)
        {
            double dist = sqrt(pow((fabs(p1.z - p0.z)), 2) + pow((fabs(p1.y - p0.y)), 2) + pow((fabs(p1.x - p0.x)), 2));
            if((max(dist, l0) == dist) && (max(dist, l1) == dist)) {
                return (dist < (l0 + l1));
            } else if((max(dist, l0) == l0) && (max(l0, l1) == l0)) {
                return (l0 < (dist + l1));
            } else {
                return (l1 < (dist + l0));
            }
        }

        /*
         * The cosine rule is used to find the elbow angle. A positive value
         * indicates a counter-clockwise angle, a negative value a clockwise
         * angle. Since this problem has at most 2 solutions for any given
         * position of P0 and P1, I am returning a structure of angles which can
         * be used to consider angles from both directions, viz.
         * clockwise-negative and counter-clockwise-positive.
         */
        void return_config(struct point p0, double l0, struct point p1, double l1, struct angles *a)
        {
            double dist = sqrt(pow((fabs(p1.z - p0.z)), 2) + pow((fabs(p1.y - p0.y)), 2) + pow((fabs(p1.x - p0.x)), 2));
            double degrees = (double) acos((l0 * l0 + l1 * l1 - dist * dist) / (2 * l0 * l1)) * (180.0f / 3.1415f);
            a->clock_wise = -degrees;
            a->counter_clock_wise = degrees;
        }

        int main()
        {
            struct point p0, p1;
            struct angles a;
            p0.x = 15, p0.y = 4, p0.z = 0;
            p1.x = 20, p1.y = 4, p1.z = 0;
            double l0 = 5, l1 = 8;

            if(valid_triangle(p0, l0, p1, l1)) {
                printf("Three lengths can make a valid configuration \n");
                return_config(p0, l0, p1, l1, &a);
                printf("Angle of the elbow point (clockwise) = %lf, (counter clockwise) = %lf \n", a.clock_wise, a.counter_clock_wise);
            } else {
                double dist = sqrt(pow((fabs(p1.z - p0.z)), 2) + pow((fabs(p1.y - p0.y)), 2) + pow((fabs(p1.x - p0.x)), 2));
                if((dist <= (l0 + l1)) && (dist > l0)) {
                    a.clock_wise = -180.0f;
                    a.counter_clock_wise = 180.0f;
                    printf("Angle of the elbow point (clockwise) = %lf, (counter clockwise) = %lf \n", a.clock_wise, a.counter_clock_wise);
                } else if((dist <= fabs(l0 - l1)) && (dist < l0)) {
                    a.clock_wise = -0.0f;
                    a.counter_clock_wise = 0.0f;
                    printf("Angle of the elbow point (clockwise) = %lf, (counter clockwise) = %lf \n", a.clock_wise, a.counter_clock_wise);
                } else {
                    printf("Given combination cannot make a valid configuration\n");
                }
            }
            return 0;
        }

    However, this solution makes sense only in 2-D, because clockwise and counter-clockwise are meaningless without an axis and a direction of rotation. Returning only an angle is technically correct, but it leaves a lot of work for the client of this function to use the result in a meaningful way. How can I change this to get the axis and direction of rotation? Also, I want to know how many possible solutions there could be for this problem. Please let me know your thoughts! Any help is highly appreciated...
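    For what it's worth, the 3-D geometry: with the shoulder and target fixed, every valid elbow position lies on a circle around the shoulder-to-target axis, so there is a one-parameter family of solutions (the "swivel" angle) rather than two; the clockwise/counter-clockwise pair above is just where that circle meets one chosen plane. A hedged NumPy sketch of picking one elbow point by swivel angle, assuming a reachable, non-degenerate pose:

        import numpy as np

        def elbow_position(shoulder, target, l0, l1, swivel_deg=0.0):
            """One elbow position for a 2-link arm in 3-D, chosen by a swivel angle."""
            shoulder = np.asarray(shoulder, float)
            target = np.asarray(target, float)
            d_vec = target - shoulder
            d = np.linalg.norm(d_vec)
            axis = d_vec / d
            # Distance from the shoulder to the circle's center along the axis
            # (law of cosines), then the circle's radius.
            a = (l0**2 - l1**2 + d**2) / (2 * d)
            r = np.sqrt(l0**2 - a**2)
            # Build a unit vector u perpendicular to the axis, then v = axis x u.
            helper = np.array([1.0, 0, 0]) if abs(axis[0]) < 0.9 else np.array([0, 1.0, 0])
            u = np.cross(axis, helper)
            u /= np.linalg.norm(u)
            v = np.cross(axis, u)
            phi = np.deg2rad(swivel_deg)
            return shoulder + a * axis + r * (np.cos(phi) * u + np.sin(phi) * v)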

    Read the article

  • Find location using only distance and range?

    - by pinnacler
    Triangulation works by checking your angle to three KNOWN targets. "I know that that's the Lighthouse of Alexandria, it's located here (X,Y) on a map, and it's to my right at 90 degrees." Repeat 2 more times for different targets and angles. Trilateration works by checking your distance from three KNOWN targets. "I know that that's the Lighthouse of Alexandria, it's located here (X,Y) on a map, and I'm 100 meters away from it." Repeat 2 more times for different targets and ranges. But both of those methods rely on knowing WHAT you're looking at. Say you're in a forest and you can't differentiate between trees, but you know where key trees are. These trees have been hand-picked as "landmarks." You have a robot moving through that forest slowly. Do you know of any way to determine location based solely on angle and range, exploiting the geometry between landmarks? Note that you will see other trees as well, so you won't know which trees are the key trees. Ignore the fact that a target may be occluded; our pre-algorithm takes care of that. 1) If this exists, what's it called? I can't find anything. 2) What do you think the odds are of having two identical location 'hits'? I imagine it's fairly rare. 3) If there are two identical location 'hits,' how can I determine my exact location after I next move the robot? (I assume the chance of getting EXACT angles twice in a row, after I reposition the robot, would be statistically impossible, barring a forest growing in rows like corn.) Would I just calculate the position again and hope for the best? Or would I somehow incorporate my previous position estimate into my next guess? If this exists, I'd like to read about it, and if not, develop it as a side project. I just don't have time to reinvent the wheel right now, nor the time to implement this from scratch. So if it doesn't exist, I'll have to figure out another way to localize the robot, since that's not the aim of this research; if it does, let's hope it's semi-easy.
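    What the poster describes is usually called the data association (correspondence) problem in landmark-based localization. With indistinguishable landmarks it is commonly attacked by hypothesize-and-test alignment (RANSAC-style) or geometric hashing, and ambiguous hits are resolved the way the poster guesses: by fusing the previous pose estimate, which is what particle filters and Kalman-style SLAM do. A rough sketch of the hypothesize-and-test idea, assuming the range/bearing observations have already been converted to robot-frame (x, y) points:

        import numpy as np
        from itertools import permutations

        def pose_from_pair(obs, mapped):
            """Rigid 2-D transform (R, t) taking two observed points onto two map points."""
            co, cm = obs.mean(axis=0), mapped.mean(axis=0)
            H = (obs - co).T @ (mapped - cm)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:          # guard against a reflection
                Vt[-1] *= -1
                R = Vt.T @ U.T
            return R, cm - R @ co

        def localize(observed, landmarks, tol=0.5):
            """Try map pairs whose spacing matches one observed pair; keep the
            pose hypothesis that explains the most observed trees.
            (A fuller version would also loop over observed pairs, since the
            first observed pair may include a non-landmark tree.)"""
            best, best_inliers = None, 0
            pair = observed[:2]
            d_obs = np.linalg.norm(pair[0] - pair[1])
            for i, j in permutations(range(len(landmarks)), 2):
                m = landmarks[[i, j]]
                if abs(np.linalg.norm(m[0] - m[1]) - d_obs) > tol:
                    continue                  # spacing alone rules this pair out
                R, t = pose_from_pair(pair, m)
                moved = observed @ R.T + t
                gaps = np.linalg.norm(moved[:, None] - landmarks[None], axis=2)
                inliers = int((gaps.min(axis=1) < tol).sum())
                if inliers > best_inliers:
                    best, best_inliers = (R, t), inliers
            return best, best_inliers

    For the repeated-ambiguity case, Monte Carlo localization keeps every consistent pose alive as a particle and lets subsequent motion and measurements kill off the wrong ones, rather than recomputing from scratch each time.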

    Read the article

  • Guiding a Robot Through a Path

    - by Hamza Yerlikaya
    I have a field filled with obstacles. I know where they are located, and I know the robot's position. Using a path-finding algorithm, I calculate a path for the robot to follow. Now my problem is that I am guiding the robot from grid to grid, but this creates a not-so-smooth motion: I start at A, turn the nose toward point B, move straight until I reach point B, rinse and repeat until the final point is reached. So my question is: what kind of techniques are used for navigating in such an environment so that I get a smooth motion? The robot has two wheels and two motors; I change direction by turning the motors in reverse. EDIT: I can vary the speed of the motors. Basically, the robot is an Arduino plus an Ardumoto, and I can supply values between 0-255 to the motors in either direction.
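    One standard answer (sketched here, not the only option) is a pure pursuit controller: instead of stop-turn-drive at each grid cell, continuously steer toward a lookahead point farther along the path, which rounds off the corners automatically. A rough differential-drive version, where LOOKAHEAD, BASE_SPEED, and WHEEL_GAIN are made-up tuning constants:

        import numpy as np

        LOOKAHEAD = 0.3    # meters; how far ahead on the path to aim
        BASE_SPEED = 150   # PWM value out of 255, as on an Ardumoto-style driver
        WHEEL_GAIN = 200   # converts curvature into a left/right PWM difference

        def lookahead_point(path, pos):
            """First path point at least LOOKAHEAD away from the robot."""
            for p in path:
                if np.linalg.norm(p - pos) >= LOOKAHEAD:
                    return p
            return path[-1]

        def wheel_speeds(pos, heading, path):
            """Pure pursuit: curvature toward the lookahead point -> wheel PWMs."""
            target = lookahead_point(path, pos)
            dx, dy = target - pos
            # Express the target in the robot frame (x forward, y to the left).
            xr = np.cos(heading) * dx + np.sin(heading) * dy
            yr = -np.sin(heading) * dx + np.cos(heading) * dy
            curvature = 2.0 * yr / (xr * xr + yr * yr)  # classic pure pursuit formula
            delta = WHEEL_GAIN * curvature
            left = int(np.clip(BASE_SPEED - delta, -255, 255))
            right = int(np.clip(BASE_SPEED + delta, -255, 255))
            return left, right

    Smoothing the planned path itself (shortcutting grid waypoints wherever the straight line between them is obstacle-free, or fitting a spline) combines well with this.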

    Read the article

  • I am trying to have a wall follow robot but there are errors on the names not being declared in my s

    - by Sam
        #include <iostream>
        #include <libplayerc++/playerc++.h>

        using namespace std;

        int main(int argc, char *argv[])
        {
            using namespace PlayerCc;
            PlayerClient robot("localhost");
            BumperProxy bp(&robot, 0);
            Position2dProxy pp(&robot, 0);
            pp.SetMotorEnable(true);

            for(;;)
                double turnrate, speed;  // note: this declaration is the entire body of the for loop
            double error;
            bool wall;

            // None of the names below (motor_a_speed, motor_c_speed, SENSOR_2, SENSOR_3,
            // front_bumper, left_bumper, drive_speed, motor_a_dir, motor_c_dir, fwd, rev,
            // brake, mrest, msleep, cputs) are declared anywhere; they look like calls
            // from a Lego brickOS-style API, not from the Player/Stage library included above.
            motor_a_speed(0);
            motor_c_speed(0);
            while(1) {
                front_bumper = SENSOR_2;
                left_bumper = SENSOR_3;
                if (front_bumper > 2) {
                    if (left_bumper < 3) {
                        motor_a_speed(5);
                        motor_c_speed(drive_speed);
                        motor_a_dir(fwd);
                        motor_c_dir(fwd);
                    } else {
                        motor_a_speed(drive_speed);
                        motor_c_speed(5);
                        motor_a_dir(rev);
                        motor_c_dir(rev);
                    }
                } else {
                    motor_a_speed(drive_speed);
                    motor_c_speed(drive_speed);
                    motor_a_dir(brake);
                    motor_c_dir(brake);
                    mrest(100);
                    cputs("bump");
                    motor_a_dir(fwd);
                    motor_c_dir(rev);
                    msleep(450);
                    cputs("right");
                    motor_a_speed(10);
                    motor_a_dir(fwd);
                    motor_c_dir(fwd);
                    mrest(1300);
                }
                pp.SetSpeed(speed, turnrate);
            }
        }

    Read the article

  • How to run an Erlang-based robot? Is it possible to convert it to .hex and run it on a microcontroller?

    - by Dinesh
    I am working on an Erlang robotics project. I have made a wall-follower robot program which has two files: 1. a C program to communicate with the hardware (I think we cannot use Erlang directly for this), and 2. an Erlang program to call these functions. I want to know which platforms I can run this robot on. Is it possible to run this robot on microcontroller-based hardware (8051 or ARM7)? Is it possible to convert the Erlang program into C code, or directly into a .hex file? If anyone has any idea, please help ASAP. Thanks.

    Read the article

  • How to create real-life robots?

    - by Click Upvote
    Even before I learnt programming I'd been fascinated with how robots could work. Now I know how the underlying programming instructions would be written, but what I don't understand is how those instructions are followed by the robot. For example, if I wrote this code:

        object = Robot.ScanSurroundings(300, 400);
        if (Objects.isEatable(object)) {
            Robot.moveLeftArm(300, 400);
            Robot.pickObject(object);
        }

    How would this program be followed by the CPU in a way that would make the robot do the physical action of looking to the left, moving its arm, and such? Is it done primarily in binary language/ASM? Lastly, where would I go if I wanted to learn how to create a robot?
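    To make the gap between code and motion concrete: somewhere below a call like Robot.moveLeftArm(), a driver ultimately writes bytes to a motor controller (a PWM register, an I2C or serial servo board, and so on), and firmware on that controller turns those bytes into voltages on the motors. No hand-written assembly is needed at the application level; each layer only speaks to the one below it. A toy sketch of such a driver layer, where the 3-byte serial protocol is entirely invented for illustration:

        import serial  # pyserial

        class ArmDriver:
            """Toy driver: high-level arm commands become bytes on a serial port."""

            def __init__(self, port="/dev/ttyUSB0"):
                # Assumed: a servo controller board is listening on this port.
                self.link = serial.Serial(port, baudrate=115200)

            def move_left_arm(self, pan_deg, tilt_deg):
                # 0x01 = "move left arm" opcode in our invented protocol; the
                # controller's firmware maps the angles to PWM pulse widths.
                self.link.write(bytes([0x01, pan_deg % 256, tilt_deg % 256]))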

    Read the article

  • Hardware Programming - Hands-On Learning

    - by Sev
    Besides Arduino, what other ways are there to learn hardware programming in a hands-on way? Are there any nifty kits available, such as a pre-assembled robot that you can program to move a certain way or do certain things, or anything similar to that?

    Read the article

  • Rapid Prototyping for Embedded Systems

    - by dr_pepper
    For prototyping small embedded projects that require physical motion, what hardware prototyping tools are available? For my projects, I tend to spend more time finding parts (i.e., wood, aluminum, etc.) and making the proper cuts, measurements, and connections than writing the software and configuring the electrical hardware. Are there any affordable products that will enable me to create physical hardware strong enough to support motion? If not, what techniques or tools are available to help develop the physical hardware more quickly? Currently, I typically build my projects from wood and plastic scraps that I have lying around. What types of materials enable you to prototype more quickly? CLARIFICATION: By motion, I mean something that has to bear stress, like a robot arm powered by a servo motor that could handle moving or carrying 1-2 lbs.

    Read the article

  • Why are C, C++, and LISP so prevalent in embedded devices and robots?

    - by David
    It seems that the software language skills most sought for embedded devices and robots are C, C++, and LISP. Why haven't more recent languages made inroads into these applications? For example, Erlang would seem particularly well-suited to robotic applications, since it makes concurrent programming easier and allows hot swapping of code. Python would seem to be useful, if for no other reason than its support of multiple programming paradigms. I'm even surprised that Java hasn't made a foray into general robotic programming. I'm sure one argument would be, "Some newer languages are interpreted, not compiled" - implying that compiled languages are quicker and use fewer computational resources. Is this still the case, in a time when we can put a Java Virtual Machine on a cell phone or a SunSpot? (and isn't LISP interpreted anyway?)

    Read the article

  • How to avoid the robot getting trapped in a local minimum?

    - by nesmoht
    Hi, I have been occupying myself for some time with motion planning for robots, and have wanted to explore the possibility of improving on what the "potential field" method offers. My challenge is to avoid the robot getting trapped in a "local minimum" when using the "potential field" method. Instead of using a "random walk" approach to escape, I have thought about whether it is possible to implement a variation of A* which could act as a kind of guide, precisely to keep the robot from getting trapped in a "local minimum". Does anyone have experience with this kind of approach, or can anyone point me to literature that avoids local minima more effectively than the "random walk" approach?
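    For reference, a minimal potential-field step with an explicit "stuck" test (the gains are arbitrary placeholders). The usual cure is exactly the hybrid suggested above: let a global planner such as A* lay down waypoints and use the field only for local obstacle avoidance between them; navigation functions are the formally local-minimum-free variant of the idea.

        import numpy as np

        ATTRACT, REPULSE, RADIUS = 1.0, 100.0, 2.0  # assumed gains; tune per map

        def field_force(pos, goal, obstacles):
            """Attractive pull toward the goal plus repulsive push from nearby obstacles."""
            force = ATTRACT * (goal - pos)
            for ob in obstacles:
                diff = pos - ob
                d = np.linalg.norm(diff)
                if 1e-9 < d < RADIUS:
                    # Standard repulsive-potential gradient (Khatib-style).
                    force += REPULSE * (1.0 / d - 1.0 / RADIUS) * diff / d**3
            return force

        def step(pos, goal, obstacles, lr=0.05, eps=1e-3):
            """One descent step; near-zero force away from the goal flags a local minimum."""
            f = field_force(pos, goal, obstacles)
            stuck = np.linalg.norm(f) < eps and np.linalg.norm(goal - pos) > eps
            return pos + lr * f, stuck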

    Read the article

  • Java-Powered Robot Named NAO Wows Crowds

    - by Tori Wieldt
    He drew a crowd wherever he went at JavaOne. And at only 22.5 inches/573 mm tall, that's pretty impressive. Nao (pronounced "now") is an autonomous, programmable humanoid robot developed by Aldebaran Robotics, a French robotics company. Over 200 academic institutions worldwide have made use of the robot. In this video from JavaOne, Nicolas Rigaud shows off the NAO robot, which you can control with Java. We are eager to see what Java developers can do with a robot that can walk, talk, see, hear, and dance. You can see several pictures in the blog post Aldebaran Robotics at JavaOne. Learn more about the Aldebaran Robotics developer program.

    Read the article

  • Thursday Community Keynote: "By the Community, For the Community"

    - by Janice J. Heiss
    Sharat Chander, JavaOne Community Chairperson, began Thursday's Community Keynote. As part of the morning's theme of "By the Community, For the Community," Chander noted that 60% of the material at the 2012 JavaOne conference was presented by Java Community members. "So next year, when the call for papers starts, put in your submissions," he urged.

    From there, Gary Frost, Principal Member of Technical Staff, AMD, expanded upon Sunday's Strategy Keynote exploration of Project Sumatra, an OpenJDK project targeted at bringing Java to heterogeneous computing platforms (which combine the CPU and the parallel processor of the GPU into a single piece of silicon). Sumatra entails enhancing the JVM to make maximum use of these advanced platforms. Within this development space, AMD created the Aparapi API, which converts Java bytecode into OpenCL for execution on such GPU devices. The Aparapi API was open sourced in September 2011. Whether it was zooming in on a Mandelbrot set, "the game of life," or a swarm of 10,000 Dukes in a space-bound gravitational dance, Frost's demos, using an Aparapi/OpenCL implementation, produced stunningly faster display results. He indicated that the Java 9 timeframe is where they see Project Sumatra coming to ultimate fruition, employing the lambdas of Java 8.

    Returning to the theme of the keynote, Donald Smith, Director, Java Product Management, Oracle, explored a mind-map graphic demonstrating the importance of community in terms of fostering innovation. "It's the sharing and mixing of culture, the diversity, and the rapid prototyping," he said. Within this topic, Smith brought up a panel of representatives from Cloudera, Eclipse, Eucalyptus, Perrone Robotics, and Twitter -- ideal manifestations of community and innovation in the world of Java.

    Marten Mickos, CEO, Eucalyptus Systems, explored his company's open source cloud software platform, written in Java and used by gaming companies, technology companies, media companies, and more. Chris Aniszczyk, Operations Engineering, Twitter, noted the importance of the JVM in terms of their multiple-language development environment. Mike Olson, CEO, Cloudera, described his company's Apache Hadoop-based software, support, and training. Mike Milinkovich, Executive Director, Eclipse Foundation, noted that they have about 270 tools projects at Eclipse, with 267 of them written in Java. Milinkovich added that Eclipse will even be going into space in 2013, as part of the control software on various experiments aboard the International Space Station. Lastly, Paul Perrone, CEO, Perrone Robotics, detailed his company's robotics and automation software platform, built 100% on Java, including Java SE and Java ME -- "on rat, to cat, to elephant-sized systems." Milinkovich noted that communities are by nature so good at innovation because of their very openness -- "The more open you make your innovation process, the more ideas are challenged, and the more developers are focused on justifying their choices all the way through the process."

    From there, Georges Saab, VP Development Java SE OpenJDK, continued the topic of innovation and helping the Java Community to "Make the Future Java." Martijn Verburg, representing the London Java Community (winner of a Duke's Choice Award 2012 for their activity in OpenJDK and the JCP), soon joined Saab onstage. Verburg detailed the LJC's "Adopt a JSR" program -- "to get day-to-day developers more involved in the innovation that's happening around them." From its London launching pad, the innovative program has spread to Brazil, Morocco, Latvia, India, and more.

    Other active participants in the program joined Verburg onstage -- Ben Evans, London Java Community; James Gough, Stackthread; Bruno Souza, SOUJava; Richard Warburton, jClarity; and Cecelia Borg, Oracle -- OpenJDK Onboarding. Together, the group explored the goals and tasks inherent in the Adopt a JSR program -- from organizing hack days (testing prototype implementations), to managing mailing lists and forums, to triaging issues, to evangelism -- all with the goal of fostering greater community/developer involvement, but equally importantly, building better open standards. "Come join us, and make your ecosystem better!" urged Verburg.

    Paul Perrone returned to profile the latest in his company's robotics work around Java -- including the AARDBOTS family of smaller robotic vehicles, running the Perrone MAX platform on top of the Java JVM. Perrone took his "Rumbles" four-wheeled robot out for a spin onstage -- a roaming, ARM-based security-bot vehicle, complete with IR, ultrasonic, and "cliff" sensors (the latter, for the raised stage at JavaOne). As an ultimate window into the future of robotics, Perrone displayed a "head-set" controller -- a sensor directed at the forehead to monitor brainwaves, for the someday-implementation of brain-to-robot control.

    Then, just when it seemed this might be the end of the day's futuristic offerings, a mystery voice from offstage pronounced "I've got some toys" -- proving to be guest visitor James Gosling, there to explore his cutting-edge work with Liquid Robotics. While most think of robots as something with wheels or arms or lasers, Gosling explained, the Liquid Robotics vehicle is an entirely new and innovative ocean-going 'bot. Looking like a floating surfboard, with an attached set of underwater wings, the autonomous devices roam the oceans using only the energy of ocean waves to propel them, and a single actuated rudder to steer. "We have to accomplish all guidance just by wiggling the rudder," Gosling said. The devices offer applications from self-installing weather buoy, to pollution monitoring station, to marine mammal monitoring device, to climate change data gathering, to even ocean life genomic sampling. The early versions of the vehicle used C code on very tiny industrial microcontrollers, where they had to "count the bytes one at a time." But the latest generation vehicles, which just hit the water a week or so ago, employ an ARM processor running Linux and the ARM version of JDK 7. Gosling explained that vehicle communication from remote locations is achieved via the Iridium satellite network. But because of the costs of this communication path, the data must be sent in very small bursts -- using SBD short burst data. "It costs $1/kb, so that rules everything in the software design," said Gosling. "If you were trying to stream a Netflix video over this, it would cost a million dollars a movie. ...We don't have a 'big data' problem," he quipped. There are currently about 150 Liquid Robotics vehicles out traversing the oceans. Gosling demonstrated real-time satellite tracking of several vehicles currently at sea, noting that Java is actually particularly good at AI applications -- due to the language having garbage collection, which facilitates complex data structures.

    To close out his time onstage, Gosling of course participated in the ceremonial Java tee-shirt toss out to the audience... In parting, Chander passed the JavaOne Community Chairperson baton to Stephen Chin, Java Technology Evangelist, Oracle. Onstage in full motorcycle gear, Chin noted that he'll soon be touring Europe by motorcycle, meeting Java Community members and streaming live via UStream -- the ultimate manifestation of community and technology! He also reminded attendees of the upcoming JavaOne Latin America 2012, São Paulo, Brazil (December 4-6, 2012), and stated that the CFP (call for papers) at the conference has been extended for one more week. "Remember, December is summer in Brazil!" Chin said.

    Read the article

  • Security in Robots and Automated Systems

    - by Roger Brinkley
    Alex Dropplinger posted a Freescale blog on Securing Robotics and Automated Systems, where she asks the question, "How should we secure robotics and automated systems?" My first thought on this was: duh, make sure your robot is running Java. Java's built-in services for authentication, authorization, encryption/confidentiality, and the like can be leveraged to benefit robotic or autonomous implementations. Leveraging these built-in services and the pluggable encryption models of Java makes adding security to an existing bot implementation much easier. But then I thought I should ask an expert on robotics, so I fired the question off to Paul Perrone of Perrone Robotics. Paul builds automated vehicles and other forms of embedded devices, like automated monitoring of commercial vehicles on highways. He says that most of the work that robots do now is autonomous, so it isn't a problem in the short term. But long-term projects, like collision avoidance technology in automobiles, are going to require it. Some of the work he's doing with his Java-based MAX -- a set of software building blocks containing a wide range of low-level and higher-level software modules that developers can use to build simple to complex robot and automation applications faster and cheaper -- already provides some support for JAUS compliance and, because it's based on Java, access to standards-based security APIs. But, as Paul explained to me, "the bottom line is… it depends on the criticality level of the bot, its network connectivity, and whether or not standards compliance is required."

    Read the article

  • Java Spotlight Episode 138: Paul Perrone on Life Saving Embedded Java

    - by Roger Brinkley
    Interview with Paul Perrone, founder and CEO of Perrone Robotics, on using Java Embedded to test autonomous vehicle operations for the Insurance Institute for Highway Safety -- work that will save lives. Right-click or Control-click to download this MP3 file. You can also subscribe to the Java Spotlight Podcast Feed to get the latest podcast automatically. If you use iTunes, you can open iTunes and subscribe with this link: Java Spotlight Podcast in iTunes.

    Show Notes

    News: JDK 8 is Feature Complete; Java SE 7 Update 25 Released; What should the JCP be doing?; 2013 Duke's Choice Award Nominations; Another quick update to the Code Signing article on OTN.

    Events: June 24, Austin JUG, Austin, TX; June 25, Virtual Developer Day - Java, EMEA, 10AM CEST; Jul 16-19, Uberconf, Denver, USA; Jul 22-24, JavaOne Shanghai, China; Jul 29-31, JVM Language Summit, Santa Clara; Sep 11-12, JavaZone, Oslo, Norway; Sep 19-20, Strange Loop, St. Louis; Sep 22-26, JavaOne San Francisco 2013, USA.

    Feature Interview: Paul J. Perrone is founder/CEO of Perrone Robotics. Paul architected the Java-based general-purpose robotics and automation software platform known as “MAX”. Paul has overseen MAX's application to rapidly field self-driving robotic cars, unmanned air vehicles, factory and road-side automation applications, and a wide range of advanced robots and automation applications. He fielded a self-driving autonomous robotic dune buggy in the historic 2005 Grand Challenge race across the Mojave desert and a self-driving autonomous car in the 2007 Urban Challenge through a city landscape. His work has been featured in numerous televised and print media including the Discovery Channel, a theatrical documentary, scientific journals, trade magazines, and international press. Since 2008, Paul has also been working as the chief software engineer, CTO, and roboticist automating rock star Neil Young's LincVolt, a 1959 Lincoln Continental retrofitted as a fully autonomous extended-range electric vehicle. Paul has been an engineer, author of books and articles on Java, frequent speaker on Java, and entrepreneur in the robotics and software space for over 20 years. He is a member of the Java Champions program, recipient of three Duke Awards including a Gold Duke and a Lifetime Achievement Award, has showcased Java-based robots at five JavaOne keynotes, and is a frequent JavaOne speaker and show floor participant. He holds a B.S.E.E. from Rutgers University and an M.S.E.E. from the University of Virginia.

    What's Cool: Shenandoah, a pauseless GC for OpenJDK.

    Read the article

  • Concurrency and Coordination Runtime (CCR) Learning Resources

    - by Harry
    I have recently been learning the ins and outs of the Concurrency and Coordination Runtime (CCR). Finding good learning resources for this relatively new technology has been quite difficult. (A quick Google search brings up "Creedence Clearwater Revival" as the top result!) Some of the resources I have found:

    Free e-book chapter from WROX on the Robotics Developer Studio
    Good article/post on InfoQ
    Robotics member blog
    Very active MSDN CCR forum - got plenty of help from here!
    Great MSDN Magazine article by Jeffrey Richter
    Official CCR User Guide - didn't find this very helpful
    Great blogging series on CCR
    iodyner CCR-related blog - Update: moved to here
    Eight or so videos on Channel9.msdn.com
    CCR Patterns page on MS Robotics Studio - I haven't read this yet
    4 x CCR questions on Stack Overflow - most of the questions have been mine! LOL
    CCR and DSS toolkit has now been released to MSDN members

    Do you have any good learning resources for the CCR? I really hope that Microsoft will publish more material; so far it has been too Robotics-specific. I believe that MS needs to acknowledge that most people are using the CCR in isolation from the DSS and Robotics Studio. Update: The MIX 2010 conference had a presentation by MySpace about how they have used the CCR framework in their middle tier. They also open sourced the code base: MySpace DataRelay - MIX video presentation.

    Read the article

  • MS Tech Ed 2011 Coming Soon

    - by sonam
    Microsoft Tech Ed 2010 was a great success. In fact, most such conferences provide a great place to meet other technology enthusiasts and, of course, to learn what's in the pipeline for a company's or a field's future products. And yet again, MS Tech Ed India is coming on 23-25 March in Bangalore, India. Well, the place is of course well suited for any IT/computing conference. After all, it's the Silicon Valley of India.

    From last year, I remember a session by Harish about "Building pure client-side apps with jQuery and Microsoft Ajax." Here's the video: http://live.viasilverlight.com/TechEdOnDemand/Breakouts/TheWebSimplified1/Session4/AjaxClientSideApps.wmv

    Only at that time did I get to know that jQuery is so easy to use for Ajax or client-side templating. I prefer jQuery over Microsoft Ajax many folds, though. UpdatePanel is dead for sure, in my view. I believe web forms will be dead sooner or later, with ASP.NET MVC gaining share many folds. (TODO: Learn MVC.) The new standard is surely jQuery. By the way, last year's videos and PPTs are available to browse and download: http://microsoftteched.in/2010/downloads.aspx

    After going through the Tech Ed 2011 session agendas (http://www.microsoft.com/india/teched2011/agenda.aspx), a few of my personal choices to watch would be:

    Day 1: a) Identity and Access Control in the Cloud; b) Windows 7 at Home: Digitizing your Home (sounds cool); c) and of course, jQuery and MS Ajax (let's see if MS can do something that's not already happening with their version of Ajax).

    Day 2: a) Lap Around Silverlight 5 and HTML 5, as I have heard some hot talk that HTML 5 will kill Silverlight (I don't see it in the near future, though); b) HTML 5 more than "HTML 5"... Google will be seeing this one.

    Day 3: a) Cross-Browser Applications in Azure; b) VS 2010 sessions on automated testing of Azure apps, etc.

    Windows Phone 7 sessions will surely be of more interest now after the MS-Nokia deal. Though, personally, I would want at least some sessions on MS's future in robotics and AI. Perhaps I am looking in the wrong place... (When is PDC?)

    And since Bill Gates considers robotics the next big thing, refer to this one: http://www.cs.virginia.edu/robins/A_Robot_in_Every_Home.pdf. I am sure they won't lose this new hot spot to competitors, the way Google rules online search now. Robotics and AI will surely provide a big battlefield for the future. See what IBM is doing with IBM Watson. Or see this: http://www.sciencedaily.com/releases/2011/02/110218083711.htm -- this is cool only if you can control your mind. At least, I'll prefer regular driving (I would devote my mind to seeing the people and places we pass on the road); that's what makes a journey "cool". :P

    Read the article

  • "Well, Swing took a bit of a beating this week..."

    - by Geertjan
    One unique aspect of the NetBeans community presence at JavaOne 2012 was its usage of large panels to highlight and discuss various aspects (e.g., Java EE, JavaFX, etc.) of NetBeans IDE usage and tools. For example, here's a pic of one of the panels, taken by Markus Eisele. Above you see me, Sean Comerford from ESPN.com, Gerrick Bivins from Halliburton, Angelo D'Agnano and Ioannis Kostaras from the NATO Programming Center, and Çagatay Çivici from PrimeFaces. (And Tinu Awopetu was also on the panel but not in the picture!) On one of those panels a remark was made which has kind of stuck with me. Henry Arousell, a member of the "NetBeans Platform Discussion Panel," who works on accounting software in Sweden, together with Thomas Boqvist, who was also at JavaOne, said, a bit despondently, I thought, the following words at the start of the demo of his very professional looking accounting software: "Well, Swing took a bit of a beating this week..." That remark comes in the light of several JavaFX sessions held at JavaOne, together with many sessions from the web and mobile worlds making the argument that the browser, tablet, and mobile platforms are the future of all applications everywhere. However, then I had another look at the list of Duke's Choice Award winners: http://www.oracle.com/us/corporate/press/1854931 OK, there are 10 winners of the Duke's Choice Award this year. Three of them (JDuchess, London Java Community, Student Nokia Developer Group) are not awards for software, but for people or groups. So, that leaves seven awards. Three of those (Hadoop, Jelastic, and Parleys) are, in one way or another, web-oriented solutions, though Hadoop and Jelastic are broader than that: they are service-oriented solutions relating to cloud technologies. That leaves four others: NATO air defense software, Liquid Robotics software, AgroSense software, and UNHCR Refugee Registration software. All of these are, on the software level, Java desktop solutions that, on the UI layer, make use of Java Swing, together with LuciadMaps (NATO), GeoToolkit (AgroSense), and WorldWind (Liquid Robotics). (And it went even further than that, i.e., this is not passive usage of Swing but active and motivated: Timon Veenstra, during his AgroSense demo, said "There are far more Swing applications out there than we seem to think. Web developers just make more noise." And, during his Liquid Robotics demo, James Gosling said: "Not everything can be done in HTML.") Seems to me that Java Swing was the enabler of more Duke's Choice Award winners this year than any other UI-oriented Java technology. Now, I'm not going to interpret that one way or another, since I've noticed that interpretations of facts tend to validate some underlying agenda. Take any fact anywhere and you can interpret it to prove whatever opinion you're already holding to be true. Therefore, no interpretation from me. I'm simply stating the fact that Swing, far from taking a beating during JavaOne 2012, was a more significant user interface enabler of Duke's Choice Award winners than any other Java user interface technology. That's not an interpretation, but a fact.

    Read the article
