Search Results

Search found 787 results on 32 pages for 'augmented reality'.

  • Augmented Reality and Translation: Use Case in Enterprise?

    - by ultan o'broin
    Really love this iPhone app from Quest Visual: Word Lens. Great to see the concept of augmented reality (a hot topic in UX) and translation coming together. Of course, I've downloaded the app and I'm already trying it out! Mashable says it all about this app in terms of how it feels like sci-fi coming to life. However, the question remains: how could such an app be used in the enterprise applications space? Opinions welcome!

    Read the article

  • National Geographic Channel’s Live Augmented Reality–Awesome Video

    - by Gopinath
    Augmented reality blurs the line between what is real and what is computer generated by enhancing what we see, hear, feel and smell. Appshaker recently launched a live augmented reality campaign that lets you immerse yourself in the scenes of National Geographic Channel – play with leopards, experience a space landing, stand next to dinosaurs, splash water along with dolphins and more. Check out the video for yourself to see the awesomeness. The thousands of Hungarians who had the chance to experience this live augmented reality display must count among the luckiest. This article, titled National Geographic Channel's Live Augmented Reality–Awesome Video, was originally published at Tech Dreams.

    Read the article

  • How to begin with augmented reality?

    - by Terri
    I'm currently an undergrad in computer science and I'll be entering my final year next year. Augmented reality is something I find to be a really interesting topic, but I have no idea where to start learning about it. Where do you start learning about this topic and what libraries are available?

    Read the article

  • A simple augmented reality application in C#

    - by Ian
    Hi all, I would like to start a personal project that will give me a lot of new concepts to learn and understand. After some thought, I figured that an augmented reality related project would be the most beneficial for me, for the following reasons: I haven't tried interfacing a program with a live video/camera feed; I haven't done any "image processing" project; and I haven't done any "graphics rendering" ever. With that said, you can assume that I will totally suck at AR. So I'm here to ask for some advice regarding the best way to go through this. The 1st milestone project that I am thinking of is to take a cheap webcam and read some Data Matrix codes using C#. The 2nd milestone project will render a text overlay on the feed when presented with a certain Data Matrix. The 3rd milestone project will actually render some 3D shapes. I've searched all around and found some nice yet advanced materials for AR: http://sites.google.com/site/augmentedrealitytestingsite/ http://soldeveloper.com/ http://www.mperfect.net/wpfAugReal/ So I came here to ask the following: Do you think my milestones are reasonable given my "deficiencies"? In order to start milestone 1, can you give me some materials that I can study? I prefer online materials. Thanks!

    Read the article

  • Monotouch Augmented Reality

    - by cvista
    Hi, I'm trying to use the code posted here: http://seanho.posterous.com/monotouch-first-attempt-arkit-c-version However, when I try to overlay it on a camera view, it seems to behave really strangely. I'm guessing that it's because the camera view only does portrait? Has anyone successfully used this? Or does anyone know how to get this working? Cheers w://

    Read the article

  • How to present an Android Augmented Reality application?

    - by Egor
    I'm currently working on my diploma project - an application for Android that uses augmented reality technology. I'll have to give a presentation of my app, and I can't really think of a way to show this kind of application to a crowd. Does anyone have ideas? I'd really appreciate an elegant solution! EDIT: The application shows users information about the places around them, something like Layar + Google Places. So the issue is that to use the application you must turn around with the device in your hands to see the icons on the screen. The presentation would be much easier if I could just lay the device on the table and use the app.

    Read the article

  • Fiction to Reality Timeline Charts Introduction of Sci-Fi Concepts to Real Life

    - by Jason Fitzpatrick
    Videophones, voice-controlled computers, heads-up displays, and other technological innovations made their first appearances in Sci-Fi. This dual timeline charts each concept's first appearance in Sci-Fi against the date of its commercial success in the real world. Hit up the link below for the full-resolution image. The Fiction to Reality Timeline [via Cool Infographics]

    Read the article

  • Internet of Things Becoming Reality

    - by kristin.jellison
    The Internet of Things is not just on the radar—it’s becoming a reality. A globally connected continuum of devices and objects will unleash untold possibilities for businesses and the people they touch. But the “things” are only a small part of a much larger, integrated architecture. A great example of this comes from the healthcare industry. Imagine an expectant mother who needs to watch her blood pressure. She lives in a mountain village 100 miles away from medical attention. Luckily, she can use a small “wearable” device to monitor her status and wirelessly transmit the information to a healthcare hub in her village. Now, say the healthcare hub identifies that the expectant mother’s blood pressure is dangerously high. It sends a real-time alert to the patient’s wearable device, advising her to contact her doctor. It also pushes an alert with the patient’s historical data to the doctor’s tablet PC. He inserts a smart security card into the tablet to verify his identity. This ensures that only the right people have access to the patient’s data. Then, comparing the new data with the patient’s medical history, the doctor decides she needs urgent medical attention. GPS tracking devices on ambulances in the field identify and dispatch the closest one available. An alert also goes to the closest hospital with the necessary facilities. It sends real-time information on her condition directly from the ambulance. So when she arrives, they already have a treatment plan in place to ensure she gets the right care. The Internet of Things makes a huge difference for the patient. She receives personalized and responsive healthcare. But this technology also helps the businesses involved. The healthcare provider achieves a competitive advantage in its services. The hospital benefits from cost savings through more accurate treatment and better application of services. All of this, in turn, translates into savings on insurance claims. This is an ideal scenario for the Internet of Things—when all the devices integrate easily and when the relevant organizations have all the right systems in place. But in reality, that can be difficult to achieve. Core design principles are required to make the whole system work. Open standards allow these systems to talk to each other. Integrated security protects personal, financial, commercial and regulatory information. A reliable and highly available systems infrastructure is necessary to keep these systems running 24/7. If this system were just made up of separate components, it would be prohibitively complex and expensive for almost any organization. The solution is integration, and Oracle is leading the way. We’re developing converged solutions, not just from device to datacenter, but across devices, utilizing the Java platform, and through data acquisition and management, integration, analytics, security and decision-making. The Internet of Things (IoT) requires the predictable action and interaction of a potentially endless number of components. It’s in that convergence that the true value of the Internet of Things emerges. Partners who take the comprehensive view and choose to engage with the Internet of Things as a fully integrated platform stand to gain the most from the Internet of Things’ many opportunities. To discover what else Oracle is doing to connect the world, read about Oracle’s Internet of Things Platform. Learn how you can get involved as a partner by checking out the Oracle Java Knowledge Zone. Best regards, David Hicks

    Read the article

  • Enterprise 2.0: Expectations vs. Reality

    - by kellsey.ruppel(at)oracle.com
    If you haven't heard it already, check out the podcast interview that Enterprise 2.0 expert John Brunswick did with Bob Rhubart, host of ArchBeat Podcast. Listen as John discusses some of the expectations vs. reality when it comes to Enterprise 2.0. Listen to Part 1 Listen to Part 2 Listen to Part 3 You can connect with John Brunswick and learn more about Enterprise 2.0 at the following links: John's Homepage Twitter: @johnbrunswick Linked In Oracle Fusion ECM Blog AIIM Enterprise 2.0 Blog Enterprise 2.0 Workbench (Youtube) JSP and Beyond (ebook) OTN Technical Articles: Extending the Business Value of SOA through Business Process Management Unlocking the Value of Enterprise 2.0 Collaboration and Authoring Tools Principles of Natural Participation And here are some additional links if you are interested in learning more about Bob Rhubart and ArchBeat: ArchBeat blog ArchBeat Podcast Oracle Architect Community Mix Profile Linked In FriendFeed Twitter: @brhubart

    Read the article

  • test coverage reality

    - by iPhoneDeveloper
    I am NOT doing test-driven development; I write my test classes after the actual code is written. In my current project I have a line coverage of 70% for 3000 lines of Java code (using JUnit, Mockito and Sonar for testing). But I feel I am not actually covering and catching 70% of the problems that can occur. So my question is: in theory, is it possible to have 100% line coverage that is meaningless in reality because of the low quality of the test code? Is 40% coverage made of well-written tests much better than a bad 100% coverage, or can we always say that line coverage more or less gives the percentage of all covered issues?
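
    To make this concrete, here is a minimal hedged sketch in JUnit 4 (the class under test and all names are invented for illustration): both tests execute every line of the same method, so a tool like Sonar reports identical line coverage for them, yet only the second test would ever fail if the calculation broke.

        import org.junit.Test;
        import static org.junit.Assert.assertEquals;

        public class CoverageVsValueTest {

            // Hypothetical class under test, invented for this example.
            static class DiscountCalculator {
                double tenPercentOff(double amount) {
                    double discount = amount * 0.10;
                    return amount - discount;
                }
            }

            // Executes every line of tenPercentOff, so line coverage reports it
            // as covered, but with no assertion a broken calculation still passes.
            @Test
            public void coveredButWorthless() {
                new DiscountCalculator().tenPercentOff(1000.0);
            }

            // Covers the same lines and actually pins the expected behaviour.
            @Test
            public void tenPercentOffOfOneThousandIsNineHundred() {
                assertEquals(900.0, new DiscountCalculator().tenPercentOff(1000.0), 0.001);
            }
        }

    In that sense, 100% line coverage built from tests like the first tells you very little, while 40% coverage built from tests like the second can catch real regressions.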

    Read the article

  • Graduate expectations versus reality

    - by Bobby Tables
    When choosing what we want to study, and do with our careers and lives, we all have some expectations of what it is going to be like. Now that I've been in the industry for almost a decade, I've been reflecting a bit on what I thought (back when I was studying Computer Science) programming working life was going to be like, and how it's actually turning out to be. My two biggest shocks (or should I say, broken expectations) by far are the sheer amount of maintenance work involved in software, and the overall lack of professionalism: Maintenance: At uni, we were all told that the majority of software work is maintenance of existing systems. So I knew to expect this in the abstract. But I never imagined exactly how overwhelming this would turn out to be. Perhaps it's something I mentally glazed over, and hoped I'd be building cool new stuff from scratch a lot more. But it really is the case that most jobs are overwhelmingly maintenance, bug fixing, and support oriented. Lack of professionalism: At uni, I always had the impression that commercial software work is very process-oriented and stringently engineered. I had images of ISO processes, reams of technical documentation, every feature and bug being strictly documented, and a generally professional environment. It came as a huge shock to realise that most software companies operate no differently to a team of students working on a large semester-long project. And I've worked in both the small agile hack shop, and the medium sized corporate enterprise. While I wouldn't say that it's always been outright "unprofessional", it definitely feels like the software industry (on the whole) is far from the strong engineering discipline that I expected it to be. Has anyone else had similar experiences to this? What are the ways in which your expectations of what our profession would be like were different to the reality?

    Read the article

  • Kinect hacked for augmented reality

    - by Kit Ong
    It seems Kinect has more potential than any other console-based motion detection device, given the number of hacks that are out there in the wild. http://uk.videogames.games.yahoo.com/blog/article/19744/kinect-as-youve-never-seen-it-before.html Direct links to YouTube videos of Kinect hacks: http://www.youtube.com/watch?v=M-wLOfjVfVc?fs=1&hl=en_GB http://www.youtube.com/watch?v=eWmVrfjDCyw?fs=1&hl=en_GB http://www.youtube.com/watch?v=P3gfMXwQOGI?fs=1&hl=en_GB http://www.youtube.com/watch?v=4qhXQ_1CQjg?fs=1&hl=en_GB http://www.youtube.com/watch?v=VgLp-KyK5g8?fs=1&hl=en_GB http://www.youtube.com/watch?v=CeQwhujiWVk?fs=1&hl=en_GB

    Read the article

  • Real-time Big Data Analytics is a reality for StubHub with Oracle Advanced Analytics

    - by Mark Hornick
    What can you use as a comprehensive platform for real-time analytics? How can you process big data volumes for near-real-time recommendations and dramatically reduce fraud? Learn in this video what StubHub achieved with Oracle R Enterprise, part of the Oracle Advanced Analytics option to Oracle Database, and read more on their story here. Advanced analytics solutions that impact the bottom line of a business are challenging due to the range of skills and individuals involved in realizing such solutions. While we hear a lot about the role of the data scientist, that role is but one piece of the puzzle. Advanced analytics solutions also have an operationalization aspect that requires close proximity to where the transactional activity occurs. The data scientist needs access to the right data with which to model the business problem. This involves IT for data collection, management, and administration, as well as for ensuring zero downtime (a website needs to be up 24x7). It also involves working with the data scientist to keep predictive models refreshed with the latest scripts. Integrating advanced analytics solutions into enterprise apps involves not just generating predictions, but supporting the whole life cycle, from data collection to model building, model assessment, and then outcome assessment and feedback into the model building process again. Application and web interface designers need to take into account how end users will see and use the advanced analytics results, e.g., supporting operations staff who need to handle potentially fraudulent transactions. As just described, advanced analytics projects can be "complicated" from just a human perspective. The extent to which software can simplify the interactions among users and systems will increase the likelihood of project success. The ability to quickly operationalize advanced analytics projects and demonstrate measurable value means the difference between a successful project and just a nice research report. By standardizing on Oracle Database and SQL invocation of R, along with in-database modeling as found in Oracle Advanced Analytics, expedient model deployment and zero downtime for refreshing models become a reality. Meanwhile, data scientists are also able to explore leading-edge techniques available in open source. The Oracle solution propels the entire organization forward to realize the value of advanced analytics.

    Read the article

  • How to get Augmented Reality: A Practical Guide examples working?

    - by Glen
    I recently bought the book Augmented Reality: A Practical Guide (http://pragprog.com/titles/cfar/augmented-reality). It has example code that it says runs on Windows, MacOS and Linux, but I can't get the binaries to run. Has anyone got this book and got the binaries to run on Ubuntu? I also can't figure out how to compile the examples in Ubuntu. How would I do this? Here is what the book says to do:

    Compiling for Linux

    Refreshingly, there are no changes required to get the programs in this chapter to compile for Linux, but as with Windows, you'll first have to find your GL and GLUT files. This may mean you'll have to download the correct version of GLUT for your machine. You need to link in the GL, GLU, and GLUT libraries and provide a path to the GLUT header file and the files it includes. See whether there is a glut.h file in the /usr/include/GL directory; otherwise, look elsewhere for it—you could use the command find / -name "glut.h" to search your entire machine, or you could use the locate command (locate glut.h). You may need to customize the paths, but here is an example of the compile command:

        gcc -o opengl_template opengl_template.cpp -I /usr/include/GL -I /usr/include -lGL -lGLU -lglut

    gcc is a C/C++ compiler that should be present on your Linux or Unix machine. The -I /usr/include/GL command-line argument tells gcc to look in /usr/include/GL for the include files. In this case, you'll find glut.h and what it includes. When linking in libraries with gcc, you use the -lX switch—where X is the name of your library and there is a corresponding libX.a file somewhere in your path. For this example, you want to link in the library files libGL.a, libGLU.a, and libglut.a, so you will use the gcc arguments -lGL -lGLU -lglut. These three files are found in the default directory /usr/lib/, so you don't need to specify their location as you did with glut.h. If you did need to specify the library path, you would add -L to the path.

    To run your compiled program, type ./opengl_template or, if the current directory is in your shell's paths, just opengl_template.

    When working in Linux, it's important to know that you may need to keep your texture files to a maximum of 256 by 256 pixels or find the settings in your system to raise this limit. Often an OpenGL program will work in Windows but produce a blank white texture in Linux until the texture size is reduced.

    The above instructions make no sense to me. Do I have to use gcc to compile, or can I use Eclipse? If I use either Eclipse or gcc, what do I need to do to compile and run the program?

    Read the article

  • Best Practices vs Reality

    - by RonHill
    On a scale depicting how closely best practices are followed, with "always" on one end and "never" on the other, my current company falls uncomfortably close to the latter. Just a couple of trivial examples: we have no code review process; there is very little documentation despite a very large code base (and some of it is blatantly incorrect or misleading); untested, buggy, or uncompilable code is frequently checked in to source control; it is comically complicated to create a debuggable build for some of our components because of their underlying architecture; unhandled exceptions are not uncommon in our releases; and empty catch { } blocks are everywhere. Now, with the understanding that it's neither practical nor realistic to follow ALL best practices ALL the time, my question is this: how closely have commonly accepted best practices been followed at the companies you've worked for? I'm kind of a noob--this is only the second company I've worked for--so I'm not sure if I'm just more of an anal-retentive coder or if I've just ended up at mediocre companies. My guess (hope?) is the latter, but a coworker with way more experience than me says every company he's ever worked for is like this. Given the obvious benefits of following most best practices most of the time, I find it hard to believe it's like this everywhere. Am I wrong?
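
    To see why the empty catch { } blocks in that list matter, here is a small hedged Java sketch (the class and file names are invented): the first method silently swallows a failure, which is exactly how problems stay invisible until a release, while the second lets the failure surface where someone can actually deal with it.

        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;
        import java.util.Collections;
        import java.util.List;

        public class ConfigLoader {

            // The anti-pattern from the list above: an empty catch block makes the
            // failure vanish, so callers proceed with an empty config and nobody knows why.
            static List<String> loadSilently(Path file) {
                try {
                    return Files.readAllLines(file);
                } catch (IOException e) {
                    // swallowed -- no log, no rethrow, no clue
                }
                return Collections.emptyList();
            }

            // Minimal alternative: let the exception propagate (or log it) so the
            // problem is visible at a point where it can actually be handled.
            static List<String> loadOrFail(Path file) throws IOException {
                return Files.readAllLines(file);
            }

            public static void main(String[] args) throws IOException {
                Path missing = Paths.get("settings-that-do-not-exist.cfg");
                System.out.println(loadSilently(missing)); // prints [] with no hint of trouble
                System.out.println(loadOrFail(missing));   // fails loudly instead
            }
        }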

    Read the article

  • The Arab HEUG is now a reality, and other random thoughts

    - by user9147039
    I just returned from Doha, Qatar, where the first-of-its-kind HEUG (Higher Education User Group) meeting for institutions in the Middle East and North Africa was held at Qatar University and jointly hosted by Dammam University from Saudi Arabia. Over 80 delegates attended, including representation from education institutions in Oman, Saudi Arabia, Lebanon, and Qatar. There are many other regional HEUG organizations in place (in Australia/New Zealand, APAC, and EMEA, as well as smaller regional HEUGs in the Netherlands, South Africa, and in regions of the US), but it was truly an accomplishment to see this Middle East/North Africa group organize and launch their chapter with a meeting of this quality. The group will be known as the Arab HEUG going forward, and I am excited about the prospects for sharing between the institutions and for the growth of Oracle solutions in the region. In particular, the hosts for the event (Qatar University) did a masterful job with logistics and organization, and the quality of the event was a testament to their capabilities. Among the more interesting and enlightening presentations I attended were one from Dammam University on the lessons learned from their implementation of Campus Solutions and transition off of Banner, and another on Qatar University's use of E-Business Suite for grants management (both pre- and post-award). The most notable fact coming from this latter presentation was the fit (89%) of E-Business Suite Grants to the university's requirements. In a few weeks' time we will be convening the 5th meeting of the Oracle Education & Research Industry Strategy Council in Redwood Shores (the 5th since my advent into my current role). The main topics of discussion will be our Higher Education Applications Strategy for the future (including cloud approaches to ERP (HCM, Finance, and Student Information Systems)) and some case studies on the benefits of leveraging delivered functionality and extensibility in the software (versus customization). On the second day of the event we will turn our attention to Oracle in Research and also to budgeting and planning in higher education. Both of these sessions will include significant participation from council members in the form of panel discussions. Our EVPs for Systems (John Fowler) and for Global Cloud Services and North America application sales (Joanne Olson) will join us for the discussion. I recently read a couple of articles that were surprising to me. The first was from Inside Higher Ed on October 15, entitled "As colleges prepare for major software upgrades, Kuali tries to woo them from corporate vendors." It continues to be a disappointment that after all this time we are still debating whether it is better to build enterprise software through open or community source initiatives when fully functional, flexible, supported, and widely adopted options exist in the marketplace. A decade or more ago, when these solutions were relatively immature and there was a great deal of turnover in the market, I could appreciate initiatives like Kuali. But let's not kid ourselves – the real objective of this movement is to counter a perceived predatory commercial software industry. Again, when commercial solutions are deployed as written without significant customization, and standard business processes are adopted, the cost of these solutions (relative to the value delivered) is quite low, and certainly much lower than the massive investment (and risk) in in-house developers to support a bespoke community source system.
    In this era of cost pressures in education and the need to refocus resources on teaching, learning, and research, I believe it's bordering on irresponsible to continue to pursue open-source ERP. Many adopters have staggering total costs and little to show for their efforts and expended resources. The second article appeared recently in the Chronicle of Higher Education, entitled "'Big Data' Is Bunk, Obama Campaign's Tech Guru Tells University Leaders." This one was so outrageous I almost don't want to legitimize it by referencing it here. In the article the writer relays statements made by Harper Reed, President Obama's former CTO for his 2012 re-election campaign, that big data solutions in education have no relevance and are akin to snake oil. He goes on to state that while he's a fan of data-driven decision making in education, most of the necessary analysis can be accomplished in Excel spreadsheets. Yeah… right. This is exactly what ails education (higher education in particular): dozens of shadow and siloed systems running on spreadsheets, with limited-to-no enterprise-wide initiatives to harness the data-rich environment that is a higher ed institution and transform the data into usable information. I'll grant Mr. Reed that "Big Data" is overused and hackneyed, but imperatives like improving student success in higher education are classic big data problems that data mining and predictive analytics can address. Further, higher ed needs to be producing far more data scientists and analysts than are currently in the pipeline, to further this discipline and the application of these tools to many other problems across multiple industries.

    Read the article
