Search Results

Search found 1020 results on 41 pages for 'recording'.

Page 36/41 | < Previous Page | 32 33 34 35 36 37 38 39 40 41  | Next Page >

  • At the Java DEMOgrounds - Oracle Java ME Embedded Enables the “Internet of Things”

    - by Janice J. Heiss
    I caught up with Oracle’s Robert Barnes, Senior Director, Java Product Management, who was demonstrating a new product from Oracle’s Java Platform, Micro Edition (Java ME) product portfolio, Oracle Java ME Embedded 3.2, a complete client Java runtime optimized for microcontrollers and other resource-constrained devices. Oracle’s Java ME Embedded 3.2 is a Java ME runtime based on CLDC 1.1 (JSR-139) and IMP-NG (JSR-228). “What we are showing here is the Java ME Embedded 3.2 that we announced last week,” explained Barnes. “It’s the start of the ‘Internet of Things,’ in which you have very, very small devices that are on the edge of the network where the sensors sit. You often have a middle area called a gateway or a concentrator, which is fairly middle- to higher-performance. On the back end you have a very high performance server. What this is showing is Java spanning all the way from the server side right down towards the type of chip that you will get at the sensor side of the network.” Barnes explained that he had two different demos running. The first, called the Solar Panel System Demo, measures the brightness of the light. “This,” said Barnes, “is a light source demo with a Cortex M3 controlling the motor, on the end of which is a sensor which is measuring the brightness of the lamp. This is recording the data of the brightness of the lamp and as we move the lamp out of the way, we should be able, using the server, to turn the sensor towards the lamp so the brightness reading will go higher. This sends the message back to the server and we can look at the web server sitting on the PC underneath the desk. We can actually see the data being passed back effectively through a back office type of function within a utility environment.” The second demo, the Smart Grid Response Demo, Barnes explained, “has the same board and processor and is still using Java ME Embedded with a different app on top. This is a demand response demo. What we are seeing within the managing environment is that people want to track the pricing signals of the electricity. If it’s particularly expensive at any point in time, they may turn something off. This demo sets the price of the electricity as though this is coming from the back of the server sending pricing signals to my home.” The demo had a lamp and a fan and it was tracking the price of electricity. “If I set the price of the electricity to go over 5 cents, then the device will turn off,” explained Barnes. “I can go into my settings and, in this case, change the price to 50 cents and we can wait a minute and the lamp will go off. When I change the pricing signal so that it is lower, the lamp will come back on. The key point is that the Java software we have running is the same across all the different devices; it’s a way to build applications across multiple devices using the same software. This is important because it fixes peak loading on the network and can stop blackouts.” This demo brought me back to a prior decade when Sun Microsystems first promoted Jini technology, a Java-based technology that would put everything on the network and give us the smart home. Your home would be automated to tell you when you were out of milk, when to change your light bulbs, etc. You would have access to the web and the network throughout your home. It’s interesting to see how technology moves over time – from the smart home to the Internet of Things.

    Read the article

  • Give a session on C++ AMP – here is how

    - by Daniel Moth
    Ever since presenting on C++ AMP at the AMD Fusion conference in June, then the Gamefest conference in August, and the BUILD conference in September, I've had numerous requests about my material from folks that want to re-deliver the same session. The C++ AMP session I put together has evolved over the 3 presentations to its final form that I used at BUILD, so that is the one I recommend you base yours on. Please get the slides and the recording from channel9 (I'll refer to slide numbers below). This is how I've been presenting the C++ AMP session: Context (slide 3, 04:18-08:18) Start with a demo, on my dual-GPU machine. I've been using the N-Body sample (for VS 11 Developer Preview). (slide 4) Use an NVIDIA slide that has additional examples of performance improvements that customers enjoy with heterogeneous computing. (slide 5) Talk a bit about the differences today between CPU and GPU hardware, leading to the fact that these will continue to co-exist and that GPUs are great for data parallel algorithms, but not much else today. One is a jack of all trades and the other is a number cruncher. (slide 6) Use the APU example from AMD, as one indication that the hardware space is still in motion, emphasizing that the C++ AMP solution is a data parallel API, not a GPU API. It has a future-proof design for hardware we have yet to see. (slide 7) Provide more meta-data, as blogged about when I first introduced C++ AMP. Code (slide 9-11) Introduce C++ AMP coding with a simplistic array-addition algorithm – the slides speak for themselves. (slide 12-13) index<N>, extent<N>, and grid<N>. (slide 14-16) array<T,N>, array_view<T,N> and a comparison between them. (slide 17) parallel_for_each. (slide 18, 21) restrict. (slide 19-20) actual restrictions of restrict(direct3d) – the slides speak for themselves. (slide 22) bring it all together with a matrix multiplication example. (slide 23-24) accelerator, and accelerator_view. (slide 26-29) Introduce tiling incl. tiled matrix multiplication [tiling probably deserves a whole session instead of 6 minutes!]. IDE (slide 34, 37) Briefly touch on the concurrency visualizer. It supports GPU profiling, but enhancements specific to C++ AMP we hope will come at the Beta timeframe, which is when I'll be spending more time talking about it. (slide 35-36, 51:54-59:16) Demonstrate the GPU debugging experience in VS 11. Summary (slide 39) Re-iterate some of the points of slide 7, and add the point that the C++ AMP spec will be open for other compiler vendors to implement, even on other platforms (in fact, Microsoft is actively working on that). (slide 40) Links to content – see slide – including where all your questions should go: http://social.msdn.microsoft.com/Forums/en/parallelcppnative/threads. "But I don't have time for a full blown session, I only need 2 (or just 1, or 3) C++ AMP slides to use in my session on related topic X" If all you want is a small number of slides, you can take some from the session above and customize them. But because I am so nice, I have created some slides for you, including talking points in the notes section. Download them here. Comments about this post by Daniel Moth welcome at the original blog.
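    For readers who just want a feel for what the slide 9-11 array-addition walkthrough looks like in code, here is a minimal sketch. It is not the slide code: it uses the restrict(amp) spelling and extent-based launch from later C++ AMP builds, whereas the Developer Preview slides referenced above still use restrict(direct3d) and grid<N>, and the function and variable names are illustrative only.

        #include <amp.h>
        using namespace concurrency;

        // Minimal C++ AMP array addition: one logical accelerator thread per element.
        void AddArrays(int n, float* pA, float* pB, float* pSum)
        {
            // array_view wraps existing host memory; data is copied to the
            // accelerator on demand and copied back on synchronize().
            array_view<float, 1> a(n, pA);
            array_view<float, 1> b(n, pB);
            array_view<float, 1> sum(n, pSum);
            sum.discard_data();   // old contents of pSum never need to reach the accelerator

            // restrict(amp) limits the lambda to the subset of C++ that can run
            // on the accelerator (no virtual calls, no arbitrary pointers, etc.).
            parallel_for_each(sum.extent, [=](index<1> idx) restrict(amp)
            {
                sum[idx] = a[idx] + b[idx];
            });

            sum.synchronize();    // force the results back into pSum
        }

    Nothing in this sketch names a particular GPU; it only describes a data parallel computation over an extent, which is the point slide 6 makes about C++ AMP being a data parallel API rather than a GPU API.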

    Read the article

  • Dallas First Regionals 2012 – For Inspiration and Recognition of Science and Technology

    - by T
    Wow! That is all I have to say after the last 3 days. Three full, fun-filled days in a world that fed the geek, sparked the competitor, inspired the humanitarian, encouraged the inventor, and continuously warmed my heart. As part of the Dallas First Regionals, I was awed by incredible students who teach as much as they learn, inventive and truly caring mentors that make mentoring look easy, and completely passionate and dedicated volunteers that bring meaning to giving all that you have and making events fun and safe for all. If you have any interest in innovation, robotics, or highly motivated students, I can’t recommend anything more highly than visiting a First Robotics event. This is my third year with First and I was honored enough to serve the Dallas First Regionals as both a Web Site evaluator and as the East Field Volunteer Coordinator. This was also the first year my daughter volunteered with me. My daughter and I both recognize how different the First program is from other team events. The difference with First is that everyone is a first-class citizen. It is a difference we can feel through experiencing and observing interactions between executives, respected engineers, students, and event staff. Even with a ferocious competition, you still find a lot of cooperation, and we never witnessed any belittling between teams or individuals. First Robotics coined the term “Gracious Professionalism”. It's a way of doing things that encourages high-quality work, emphasizes the value of others, and respects individuals and the community.1 I was introduced to this term as the Volunteer Coordinator when I was preparing the volunteer instructional speech. Through the next few days, I discovered it is truly core to everything that First does. One of the ways First accomplishes Gracious Professionalism is by utilizing another term they came up with, which is “Coopertition™”. At FIRST, Coopertition™ is displaying unqualified kindness and respect in the face of fierce competition.1 One of the things I never liked about sports was the need to “destroy” the other team. First has really found a way to integrate Coopertition™ into the rules so teams can be competitive and part of that competition rewards cooperation. Oh, and did I mention it has ROBOTS!!! This year it had basketball-playing, Kinect-connected, remote-controlled robots! There are not words for how exciting these games are. You HAVE to check out this year's game, Rebound Rumble, on YouTube, or you can view live action on Dallas First Video (as of this posting, the recordings haven’t posted yet but should be there soon). There are also some images below. Whatever it is, these students get it and exemplify it, and these mentors ooze with it. I am glad that First supports these events so we can all learn and be inspired by these exceptional students and mentors. I know that no matter how much I give, it will never compare to what I gain volunteering at First. Even if you don’t have time to volunteer, you owe it to yourself to go check out one of these events. See what the future holds, be inspired, be encouraged, take some knowledge, leave some knowledge and, most of all, have FUN. That is what First is all about and, thanks to First, I get it. First Dallas Regionals 2012 (slide show). 1. USFirst: http://www.usfirst.org/aboutus/gracious-professionalism

    Read the article

  • Openmatics Revolutionizes Fleet Management with Standards-Based Vehicle Telematics Platform

    - by Michael Snow
    Openmatics s.r.o. was founded in 2010 as a subsidiary of ZF Friedrichshafen AG, a global player in driveline and chassis technology. Oracle Customer: Openmatics s.r.o. | Location: Pilsen, Czech Republic | Industry: Automotive | Employees: 70. Its goal was to develop and operate a flexible, open telematics platform for automotive applications, which is independent from vehicle and component suppliers—recognizing that the fragmented telematics market was not meeting today’s fleet management needs. Openmatics provides a rich product portfolio, and customers can extend the platform, as required, to meet their needs. Partners and third parties can develop their own applications using the Openmatics software development kit and can sell them via the Openmatics app shop. ZF Friedrichshafen AG is a global player in driveline and chassis technology. With 121 production companies and 650 service partners in 26 countries, ZF is among the top 10 largest automotive suppliers worldwide. Founded in 1915 to develop and produce transmissions for airships and vehicles, the group’s product offerings now include transmissions and steering systems as well as chassis components and complete axle systems and modules. A word from Openmatics s.r.o.: “Oracle WebCenter Portal, together with the underlying Oracle Application Development Framework, provided the fundamental infrastructure for the Openmatics platform. Fleet managers can now reduce fuel consumption and operating costs, and more efficiently manage vehicle usage, maintenance, and safety. The standards-based platform allows third-party suppliers to deploy their own vehicle telematics services as Openmatics apps and creates a de facto standard for the automotive industry, independent from a single manufacturer or service provider.” – Gero Strobel, Head of Development, Openmatics s.r.o.
    Challenges: Create an industry standard for vehicle telematics by establishing a customizable platform that enables access to telematics information, such as current and past fuel consumption, through a web browser, to better meet automotive market and customer needs. Reduce fleet-management costs by eliminating the need to invest in isolated telematics hardware and software solutions per vehicle brand and vehicle component manufacturer. Establish an open platform where third-party providers—such as original equipment manufacturers (OEM), insurers, fleet operators, and individual developers—can deploy their own vehicle telematics services. Allow users to purchase targeted telematics services as single apps to reduce costs and ensure rapid growth of telematics services available on the platform. Enable users to configure their telematics apps with ease to make sure the platform meets individual fleet management requirements, such as analyzing past and current fuel consumption of a truck fleet.
    Solutions: Deployed Oracle WebCenter Portal as a foundation for Openmatics, a standards-based automotive telematics platform that provides next-generation fleet management with unified digital communication from and to vehicles on the move. Used Oracle Application Development Framework as the development framework for Oracle WebCenter Portal’s components and services, providing developers with ready-to-use software development kits with application programming interfaces, design templates, and visual tools that accelerated time to market. Used Oracle Enterprise Pack for Eclipse to simplify telematics application development in Java. Enabled fleet monitoring by recording vehicle data—such as fuel consumption information—through onboard units, delivering the information to Oracle Database, and making it accessible through a customizable app portfolio on any web browser. Stored vehicle telematics data—sent as encrypted information—in Oracle Database, ensuring data integrity and immediate availability for the platform’s telematics applications. Enabled a wide range of telematics services suppliers, from vehicle component manufacturers to fleet application developers, to offer vehicle telematics services on the Openmatics platform, ensuring platform independence from OEMs. Provided Openmatics customers with the means to individually select the automotive telematics services that are relevant to their business requirements, eliminating the need to pay for superfluous information and reducing fleet management costs.
    Oracle Products & Services: Oracle Application Development Framework, Oracle WebCenter Portal, Oracle SOA Suite, Oracle Enterprise Pack for Eclipse, Oracle Database, Oracle Consulting.

    Read the article

  • Getting Started with Cloud Computing

    - by juanlarios
    You’ve likely heard about how Office 365 and Windows Intune are great applications to get you started with Cloud Computing. Many of you emailed me asking for more info on what Cloud Computing is, including the distinction between "Public Cloud" and "Private Cloud". I want to address these questions and help you get started. Let's begin with a brief set of definitions and some places to find more info; however, an excellent place where you can always learn more about Cloud Computing is the Microsoft Virtual Academy. Public Cloud computing means that the infrastructure to run and manage the applications users are taking advantage of is run by someone else and not you. In other words, you do not buy the hardware or software to run your email or other services being used in your organization – that is done by someone else. Users simply connect to these services from their computers and you pay a monthly subscription fee for each user that is taking advantage of the service. Examples of Public Cloud services include Office 365, Windows Intune, Microsoft Dynamics CRM Online, Hotmail, and others. Private Cloud computing generally means that the hardware and software to run services used by your organization is run on your premises, with the ability for business groups to self-provision the services they need based on rules established by the IT department. Generally, Private Cloud implementations today are found in larger organizations, but they are also viable for small and medium-sized businesses, since they generally allow an automation of services and a reduction in IT workloads when properly implemented. Having the right management tools, like System Center 2012, to implement and operate Private Cloud is important in order to be successful. So – how do you get started? The first step is to determine what makes the most sense for your organization. The nice thing is that you do not need to pick Public or Private Cloud – you can use elements of both where it makes sense for your business – the choice is yours. When you are ready to try and purchase Public Cloud technologies, the Microsoft Volume Licensing web site is a good place to find links to each of the online services. In particular, if you are interested in a trial for each service, you can visit the following pages: Office 365, CRM Online, Windows Intune, and Windows Azure. For Private Cloud technologies, start with some of the courses on Microsoft Virtual Academy and then download and install the Microsoft Private Cloud technologies, including Windows Server 2008 R2 Hyper-V and System Center 2012, in your own environment and take them for a spin. Also, keep up to date with the Canadian IT Pro blog to learn about events Microsoft is delivering, such as the IT Virtualization Boot Camps and more, to get you started with these technologies hands on. Finally, I want to ask for your help to allow the team at Microsoft to continue to provide you with what you need. Twice a year, through something we call "The Global Relationship Study", they reach out and contact you to see how they're doing and what Microsoft could do better. If you get an email from "Microsoft Feedback" with the subject line "Help Microsoft Focus on Customers and Partners" between March 5th and April 13th, please take a little time to tell them what you think.
    Cloud Computing Resources:
    Microsoft Server and Cloud Computing site – information on Microsoft's overall cloud strategy and products.
    Microsoft Virtual Academy – free online training to help improve your IT skillset.
    Office 365 Trial/Info page – get more information or try it out for yourself.
    Office 365 Videos – see how businesses like yours have used Office 365 to transition to the cloud.
    Windows Intune Trial/Info – get more information or try it out for yourself.
    Microsoft Dynamics CRM Online page – information on trying and licensing Microsoft Dynamics CRM Online.
    Additional Resources You May Find Useful:
    Springboard Series – your destination for technical resources, free tools and expert guidance to ease the deployment and management of your Windows-based client infrastructure.
    TechNet Evaluation Center – try some of the latest Microsoft products for free, like System Center 2012 Pre-Release Products, and evaluate them before you buy.
    AlignIT Manager Tech Talk Series – a monthly streamed video series with a range of topics for both infrastructure and development managers. Ask questions and participate real-time or watch the on-demand recording.
    Tech·Days Online – discover what's next in technology and innovation with Tech·Days session recordings, hands-on labs and Tech·Days TV.

    Read the article

  • Profiling Startup Of VS2012 – JustTrace Profiler

    - by Alois Kraus
    JustTrace is made by Telerik, which is mainly known for its collection of UI controls. The current version (2012.3.1127.0) includes a performance and memory profiler, which costs 614€ and is currently on sale with a special offer for 306€. It includes one year of free upgrades. The uneven € numbers are calculated from the 799€ price and the 50% discount. The UI is already in Metro style and simple to use. Multi-process, attach, and method recording filters are not supported. It looks like JustTrace is, like Ants, a Just My Code profiler. For stuff where you do not have the pdbs or you want to dig deeper into the BCL code you will not get far. After getting the profile data you get in the All Methods grid a plain list with hit count and own time. The method list for all methods is also suspiciously short, which is a clear sign that you will not get far during the analysis of foreign code. But at least there is also a memory profiler included. For this I have to choose in the first window for Profiling Type “Memory Profiler” to check the memory consumption of VS. There are some interesting numbers to see, but I do really miss the thread stack window from YourKit. How am I supposed to get a clue, when much memory is allocated and the CPU consumption is high, about which places I should look at? The Snapshot summary gives a rough overview which is ok for a first impression. Next is Assemblies. This gives you a list of all loaded assemblies. Not terribly useful. The By Type view gives you exactly what it is supposed to do. You have to keep in mind that this list is filtered by the types you did check in the Assemblies list. The By Type instance list only shows types from assemblies which do not originate from Microsoft. By default mscorlib and System are not checked. That is the reason why my By Type window looked almost empty the first time. The idea behind this feature is to show only your instances because you are ultimately responsible for the overall memory consumption. I am not sure if I like this feature because by default it hides too much. I do want to see at least how many strings and arrays are allocated. A simple namespace filter would also do it in my opinion. Now you can examine all string instances and look at who in the object graph keeps a reference on them. That is nice, but YourKit has the big plus that you can also look into the string contents. I am also not sure how cycles in the graph are visualized and what will happen if you have thousands of objects referencing you. That's pretty much it about JustTrace. It can help the average developer to pinpoint performance and memory issues by just looking at his own code and instances. Showing them more will not help them because the sheer amount of information will overwhelm them. And you need to have a pretty good understanding of how the GC and the CLR work. When you have a performance issue at a customer machine it is sometimes very helpful to be able to bring a profiler onto the machine (no pdbs, …) and to get a full snapshot of all processes which are involved in the problematic use case. For these more advanced use cases JustTrace is certainly the wrong tool. Next: SpeedTrace

    Read the article

  • SCons does not clean all files

    - by meowsqueak
    I have a file system containing directories of "builds", each of which contains a file called "build-info.xml". However, some of the builds happened before the build script generated "build-info.xml", so in that case I have a somewhat non-trivial SCons SConstruct that is used to generate a skeleton build-info.xml so that it can be used as a dependency for further rules. I.e., for each directory: if build-info.xml already exists, do nothing; more importantly, do not remove it on a 'scons --clean'. If build-info.xml does not exist, generate a skeleton one instead - build-info.xml has no dependencies on any other files - the skeleton is essentially minimal defaults. During a --clean, remove build-info.xml if it was generated, otherwise leave it be. My SConstruct looks something like this:

        def generate_actions_BuildInfoXML(source, target, env, for_signature):
            cmd = "python '%s/bin/create-build-info-xml.py' --version $VERSION --path . --output ${TARGET.file}" % (Dir('#').abspath,)
            return cmd

        bld = Builder(generator = generate_actions_BuildInfoXML, chdir = 1)
        env.Append(BUILDERS = { "BuildInfoXML" : bld })

        ...

        # VERSION = some arbitrary string, not important here
        # path = filesystem path, set elsewhere
        build_info_xml = "%s/build-info.xml" % (path,)

        if not os.path.exists(build_info_xml):
            env.BuildInfoXML(build_info_xml, None, VERSION = build)

    My problem is that 'scons --clean' does not remove the generated build-info.xml files. I played around with env.Clean(t, build_info_xml) within the 'if', but I was unable to get this to work - mainly because I could not work out what to assign to 't' - I want a generated build-info.xml to be cleaned unconditionally, rather than based on the cleaning of another target, and I wasn't able to get this to work. If I tried a simple env.Clean(None, "build_info_xml") after but outside the 'if', I found that SCons would clean every single build-info.xml file, including those that weren't generated. Not good either. What I'd like to know is how SCons goes about determining which files should be cleaned and which should not. Is there something funny about the way I've used a generator function that prevents SCons from recording this target as a Clean candidate?

    Read the article

  • LoaderLock was detected, and turning off the warning does not help

    - by Scott M.
    I am trying to write an application that takes in sound from the default audio recording device on a computer. When running any code that accesses DirectX from my managed code I get this error: DLL 'C:\Windows\assembly\GAC\Microsoft.DirectX.DirectSound\1.0.2902.0__31bf3856ad364e35\Microsoft.DirectX.DirectSound.dll' is attempting managed execution inside OS Loader lock. Do not attempt to run managed code inside a DllMain or image initialization function since doing so can cause the application to hang. DevicesCollection coll = new DevicesCollection(); and Device d = new Device(DSoundHelper.DefaultCaptureDevice); and Capture c = new Capture(DSoundHelper.DefaultCaptureDevice); all cause the LoaderLock MDA to pop up and tell me there is a problem. I have scoured the internet (stackoverflow included) for solutions to this problem, but most people just say to turn off the warning, which does not work. When I turn off the warning, a generic ApplicationException is thrown, which is even less useful. I have seen the answers to this question as well, which didn't help because he said to remove the code that is causing the error. Others have said "fix your code." My question is: how can I call any (preferably managed) DirectX code from C# without getting this error?

    Read the article

  • How do I configure CI Hudson with PHPUnit and how do I run PHPUnit using Hudson?

    - by Vinesh
    Hi, I am getting the following error. Help is much needed for this... kindly go through the errors.
    Started by an SCM change
    Updating https://suppliesguys.unfuddle.com/svn/suppliesguys_frontend2/Frontend-Texity/src
    U sites\all\modules\print\print_pdf\print_pdf.pages.inc
    At revision 1134
    [workspace] $ sh -xe C:\WINDOWS\TEMP\hudson6292587174545072503.sh
    The system cannot find the file specified
    FATAL: command execution failed
    java.io.IOException: Cannot run program "sh" (in directory "E:\Projects\Hudson.hudson\jobs\TSG\workspace"): CreateProcess error=2, The system cannot find the file specified
    at java.lang.ProcessBuilder.start(Unknown Source)
    at hudson.Proc$LocalProc.(Proc.java:149)
    at hudson.Proc$LocalProc.(Proc.java:121)
    at hudson.Launcher$LocalLauncher.launch(Launcher.java:636)
    at hudson.Launcher$ProcStarter.start(Launcher.java:271)
    at hudson.Launcher$ProcStarter.join(Launcher.java:278)
    at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:83)
    at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:58)
    at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
    at hudson.model.AbstractBuild$AbstractRunner.perform(AbstractBuild.java:584)
    at hudson.model.Build$RunnerImpl.build(Build.java:174)
    at hudson.model.Build$RunnerImpl.doRun(Build.java:138)
    at hudson.model.AbstractBuild$AbstractRunner.run(AbstractBuild.java:416)
    at hudson.model.Run.run(Run.java:1244)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
    at hudson.model.ResourceController.execute(ResourceController.java:88)
    at hudson.model.Executor.run(Executor.java:122)
    Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified
    at java.lang.ProcessImpl.create(Native Method)
    at java.lang.ProcessImpl.(Unknown Source)
    at java.lang.ProcessImpl.start(Unknown Source)
    ... 17 more
    Publishing Javadoc
    Publishing Clover coverage report...
    No Clover report will be published due to a Build Failure
    [xUnit] Starting to record.
    [xUnit] [PHPUnit] - Use the embedded style sheet.
    [xUnit] [ERROR] - No test report file(s) were found with the pattern 'build/logs/phpunit.xml' relative to 'E:\Projects\Hudson.hudson\jobs\TSG\workspace' for the testing framework 'PHPUnit'. Did you enter a pattern relative to the correct directory? Did you generate the result report(s) for 'PHPUnit'?
    [xUnit] Stopping recording.
    Finished: FAILURE

    Read the article

  • Configuring xUnit test output in Hudson

    - by graham.reeds
    I have a simple PoC project in Hudson. The PoC has unit tests written via UnitTest++ and outputs the results as XML for consumption by xUnit to munge into jUnit format. Here are the salient details: I have my project configured to use MSBuild to build the 2008 solution. The project contains both the dll it is to build and the unit tests, which are run as a post-build step. My workspace in Hudson is set to c:\develop\money (Money is the name of the project) and in the Hudson console I can see the workspace folders, the solution file and output folders (/bin, /doc, etc). The test console app outputs its file 'money_unit_tests.xml' to the folder 'reports' (making c:\develop\money\reports). I've restarted Hudson since installing xUnit and setting the workspace. However, when I start Hudson building I am given the following message:
    [xUnit] Starting to record.
    [xUnit] [UnitTest] - Use the embedded style sheet.
    [xUnit] [ERROR] - No test report file(s) were found with the pattern 'reports/money_unit_tests.xml' relative to 'C:\.hudson\jobs\Money\workspace' for the testing framework 'UnitTest'. Did you enter a pattern relative to the correct directory? Did you generate the result report(s) for 'UnitTest'?
    [xUnit] Stopping recording.
    Finished: FAILURE
    Why does Hudson seem to think the workspace is in C:\.hudson... and not C:\Develop...? What can I do to change it? If I can't change it, what can I do to mitigate these changes? (I don't exactly want to hardcode the output for the xml to C:\.hudson...)

    Read the article

  • "Parameter type conflict" when calling Java Stored Procedure within another Java Stored Procedure

    - by GuiPereira
    Here's the problem (sorry for the bad English): I'm working with JDeveloper and Oracle 10g, and I have a Java Stored Procedure that is calling another Java SP, like in this code:

        int sd = 0;
        try {
            CallableStatement clstAddRel = conn.prepareCall(" {call FC_RJS_INCLUIR_RELACAO_PRODCAT(?,?)} ");
            clstAddRel.registerOutParameter(1, Types.INTEGER);
            clstAddRel.setString(1, Integer.toString(id_produto_interno));
            clstAddRel.setString(2, ac[i].toString());
            clstAddRel.execute();
            sd = clstAddRel.getInt(1);
        } catch(SQLException e) {
            String sqlTeste3 = "insert into ateste values (SQ_ATESTE.nextval, ?)";
            PreparedStatement pstTeste3 = conn.prepareStatement(sqlTeste3);
            pstTeste3.setString(1, "erro: " + e.getMessage() + ac[i]);
            pstTeste3.execute();
            pstTeste3.close();
        }

    I'm recording the error in a table called ATESTE because this Java SP is a procedure and not a function, so I have to manipulate DML inside. So, the error message I'm getting is: 'parameter type conflict'... The function "FC_RJS_INCLUIR_RELACAO_PRODCAT" is a Java Stored Procedure too; it is already exported to Oracle and returns an int variable, and I have to read this to decide which web service I will call from this Java SP. I have already tried OracleTypes.NUMBER in the registerOutParameter. Does anyone know what I'm doing wrong?

    Read the article

  • iPhone Development - CLLocationManager vs. MapKit

    - by Mustafa
    If I want to show userLocation on the map and at the same time record the user's location, is it a good idea to add an observer to userLocation.location and record the locations, OR should I still use CLLocationManager for recording the user location and use mapView.showsUserLocation to show the user's current location (blue indicator)? I want to show the default blue indicator supported by the MapKit API. Also, here's some rough sample code:

        - (void)viewDidLoad {
            ...
            locationManager = [[CLLocationManager alloc] init];
            locationManager.desiredAccuracy = kCLLocationAccuracyBest;
            locationManager.distanceFilter = DISTANCE_FILTER_VALUE;
            locationManager.delegate = self;
            [locationManager startUpdatingLocation];
            myMapView.showsUserLocation = YES;
            [myMapView addObserver:self forKeyPath:@"userLocation.location" options:0 context:nil];
            ...
        }

        - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
            // Record the location information
            // ...
        }

        - (void)locationManager:(CLLocationManager *)manager didUpdateToLocation:(CLLocation *)newLocation fromLocation:(CLLocation *)oldLocation {
            NSLog(@"%s begins.", __FUNCTION__);
            // Make sure that the location returned has the desired accuracy
            if (newLocation.horizontalAccuracy <= manager.desiredAccuracy) return;
            // Record the location information
            // ...
        }

    Under the hood, I think MKMapView also uses CLLocationManager to get the user's current location? So, will this create any problems, because I believe both CLLocationManager and the MapView will try to use the same location services? Will there be any conflicts or a lack of accurate/required or current data?

    Read the article

  • Can AVAudioSession do full duplex?

    - by Eric Christensen
    It would seem like it should be able to, but the following breakout test code can't do both:

        //play a file:
        NSArray *pathsArray = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [pathsArray objectAtIndex:0];
        NSString* playFilePath=[documentsDirectory stringByAppendingPathComponent:@"testplayfile.wav"];
        AVAudioPlayer *tempplayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:playFilePath] error:nil];
        [tempplayer prepareToPlay];
        [tempplayer play];

        //and record a file:
        NSString* recFilePath=[documentsDirectory stringByAppendingPathComponent:@"testrecordfile.wav"];
        AVAudioRecorder *soundrecording = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:recFilePath] settings:nil error:nil];
        [soundrecording prepareToRecord];
        [soundrecording record];

    This is the minimum I can think of to individually play one file and record another. And this works just fine in the simulator: I can play back a file and record at the same time. But it doesn't work on the iPhone itself. If I comment out either function, the other performs fine. The playback plays fine either alone or with both, if it's first. If I comment out the playback, the record records fine. (There's additional code to stop the recording not shown here.) So each works fine, but not together. I know AudioQueue has a setting to allow both, but I don't see an analogue for AVAudioSession. Any idea if it's possible, and if so, what I need to add? Thanks!

    Read the article

  • EXC_BAD_ACCESS NSURLConnection

    - by Lars
    Hi all, I get an EXC_BAD_ACCESS when I perform the last line of the function (webData).

        -(void)requestSoap{
            NSString *requestUrl = @"http://www.website.com/webservice.php";
            NSString *soapMessage = @"the soap message"; //website and soapmessage are valid in original code.
            NSError **error;
            NSURLResponse *response;
            //Convert parameter string to url
            NSURL *url = [NSURL URLWithString:requestUrl];
            NSMutableURLRequest *theRequest = [NSMutableURLRequest requestWithURL:url cachePolicy:NSURLRequestReloadIgnoringCacheData timeoutInterval:10];
            NSString *msgLength = [NSString stringWithFormat:@"%d", [soapMessage length]];
            //Create an XML message for webservice
            [theRequest addValue: @"text/xml; charset=utf-8" forHTTPHeaderField:@"Content-Type"];
            [theRequest addValue: msgLength forHTTPHeaderField:@"Content-Length"];
            [theRequest setHTTPMethod:@"POST"];
            [theRequest setHTTPBody: [soapMessage dataUsingEncoding:NSUTF8StringEncoding]];
            NSData *webData = [NSURLConnection sendSynchronousRequest:theRequest returningResponse:&response error:error];
        }

    I tried not to release a thing, because what I read on the net is that it's almost always a memory thing. When I debug the code (NSZombieEnabled = YES) this is what I get:
    [Session started at 2010-05-31 15:56:13 +0200.]
    GNU gdb 6.3.50-20050815 (Apple version gdb-1461.2) (Fri Mar 5 04:43:10 UTC 2010)
    Copyright 2004 Free Software Foundation, Inc.
    GDB is free software, covered by the GNU General Public License, and you are welcome to change it and/or distribute copies of it under certain conditions.
    Type "show copying" to see the conditions.
    There is absolutely no warranty for GDB. Type "show warranty" for details.
    This GDB was configured as "x86_64-apple-darwin".sharedlibrary apply-load-rules all
    Attaching to process 19856.
    test(19856) malloc: recording malloc stacks to disk using standard recorder
    test(19856) malloc: enabling scribbling to detect mods to free blocks
    test(19856) malloc: process 19832 no longer exists, stack logs deleted from /tmp/stack-logs.19832.test.w9Ek4L.index
    test(19856) malloc: stack logs being written into /tmp/stack-logs.19856.test.URRpQF.index
    Program received signal: “EXC_BAD_ACCESS”.
    Does anybody have a clue?? Thanks a lot! Lars

    Read the article

  • Multi-threaded app crashes in release mode

    - by etzarfat
    Hello, I'm using Visual Studio 2008 (programming in C). I have a weird problem. I wrote a program that has 2 threads that run simultaneously: a recording thread (using the audio card to record into memory) and a translation thread (using a speech engine to recognize the words). When I run my program in debug mode (aka setting a breakpoint in the code) it runs great; however, when I run it in debug mode or release mode outside the Visual Studio environment, it crashes and gives me the following exception: "Unhandled exception at 0x7c911129 in LowLevel.exe: 0xC0000005: Access violation reading location 0x014c7245."
    My stack looks like:
    LowLevel.exe!__set_flsgetvalue() Line 256 + 0xc bytes C
    LowLevel.exe!_isleadbyte_l(int c=4359676, localeinfo_struct * plocinfo=0x00000001) Line 57 C++
    014b00d8()
    LowLevel.exe!PlayDateOfExam(int option=1) Line 2240 + 0x7 bytes C++
    LowLevel.exe!NSCThread(void * arg=0x00000000) Line 1585 + 0xb bytes C++
    kernel32.dll!7c80b729()
    winmm.dll!76b5b294()
    I use the following files in my project: "nsc.lib" and "WinMM.lib". I'm not really familiar with threads; I used a sample (which works great) and built on it. I saw a similar question on the forum, but I didn't really understand the answers, since I'm not familiar with threads. Can someone help me? Thanks

    Read the article

  • C++ from SpeakHere in iPhone app

    - by niklassaers
    Hi guys, I've made a template app where I've grabbed the recording part of the SpeakHere example and removed the file handling part, but I'm struggling to get the C++ part of the app working right. As soon as it enters the C++ class, it gets syntax errors. If I don't import the header files from C++ (and then of course don't use the code) into my Objective-C classes, all works fine. I cannot see the difference between how I'm doing it and how the example is doing it. Can you see the difference? I've posted the entire code here: http://github.com/niklassaers/testFFT The build errors I get are:
    testFFT/CAStreamBasicDescription.h:91:0 testFFT/CAStreamBasicDescription.h:91: error: expected '=', ',', ';', 'asm' or '__attribute__' before 'CAStreamBasicDescription'
    testFFT/CAStreamBasicDescription.h:298:0 testFFT/CAStreamBasicDescription.h:298: error: expected '=', ',', ';', 'asm' or '__attribute__' before '<' token
    testFFT/CAStreamBasicDescription.h:299:0 testFFT/CAStreamBasicDescription.h:299: error: expected '=', ',', ';', 'asm' or '__attribute__' before '==' token
    testFFT/CAStreamBasicDescription.h:301:0 testFFT/CAStreamBasicDescription.h:301: error: expected '=', ',', ';', 'asm' or '__attribute__' before '!=' token
    testFFT/CAStreamBasicDescription.h:302:0 testFFT/CAStreamBasicDescription.h:302: error: expected '=', ',', ';', 'asm' or '__attribute__' before '<=' token
    testFFT/CAStreamBasicDescription.h:303:0 testFFT/CAStreamBasicDescription.h:303: error: expected '=', ',', ';', 'asm' or '__attribute__' before '>=' token
    testFFT/CAStreamBasicDescription.h:304:0 testFFT/CAStreamBasicDescription.h:304: error: expected '=', ',', ';', 'asm' or '__attribute__' before '>' token
    testFFT/CAStreamBasicDescription.h:307:0 testFFT/CAStreamBasicDescription.h:307: error: expected ';', ',' or ')' before '&' token
    testFFT/CAXException.h:65:0 testFFT/CAXException.h:65: error: expected '=', ',', ';', 'asm' or '__attribute__' before 'CAX4CCString'
    testFFT/CAXException.h:87:0 testFFT/CAXException.h:87: error: expected '=', ',', ';', 'asm' or '__attribute__' before 'CAXException'
    testFFT/AQRecorder.h:59:0 testFFT/AQRecorder.h:59: error: expected '=', ',', ';', 'asm' or '__attribute__' before 'AQRecorder'
    testFFT/RecorderLink.h:57:0 testFFT/RecorderLink.h:57: error: expected specifier-qualifier-list before 'AQRecorder'
    testFFT/RecorderLink.h:62:0 testFFT/RecorderLink.h:62: error: expected specifier-qualifier-list before 'AQRecorder'
    Any idea what's going on here? Cheers, Nik

    Read the article

  • Multimedia files written over WAN are getting truncated

    - by Dean
    I use the Windows Multimedia API to create .wav files.
    1. Open the file with mmioOpen
    2. Create WAVE, fmt and data chunks using mmioCreateChunk
    3. Write audio data using mmioWrite
    4. Ascend out of the chunks using mmioAscend
    5. Close the file using mmioClose
    The file is being written into a temporary location, so after it has been closed it gets copied to another location using CopyFile. This program is written in C++ and works great until the file it is writing resides over a WAN in a different city or country. The end result is that a wav file which should be 20-30 seconds long ends up being 4 seconds long. It is always the last bit that is missing, so when you play it back it just stops before the end of the recording. I initially thought that maybe I was copying the file too soon, so as a test I put in a pause of 30 seconds after closing the file using Sleep(30000), but this made no difference to either it being truncated or by how much. I have modified the program to write to a file in parallel using CreateFile and WriteFile, and the result is the same, so it is not an issue specifically with the mmio APIs. Does anyone have any ideas why this is happening and if there is a work-around to it? I suspect that I may end up having the temporary location on the local drive, but this is quite a big change to the application as well as existing deployments. Thanks for everyone's time. Dean
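    For readers who have not used the mmio functions, here is a rough, untested sketch of the call sequence those five steps describe; the function and variable names are illustrative (this is not Dean's code), error handling is omitted, and a plain PCM WAVEFORMATEX is assumed. Note that the chunk sizes in the RIFF header are only patched when each chunk is ascended, and buffered data is only guaranteed to reach the file after mmioClose, so anything that disturbs those final writes shows up as a truncated recording.

        #include <windows.h>
        #include <mmsystem.h>
        #pragma comment(lib, "winmm.lib")

        // Rough sketch only: writes one block of PCM samples as a .wav via the mmio API.
        bool WriteWavSketch(const wchar_t* path, const WAVEFORMATEX& wfx,
                            const BYTE* samples, DWORD cbSamples)
        {
            HMMIO hmmio = mmioOpenW(const_cast<LPWSTR>(path), NULL,
                                    MMIO_CREATE | MMIO_READWRITE | MMIO_ALLOCBUF);
            if (hmmio == NULL) return false;

            // 1. RIFF parent chunk of type 'WAVE'
            MMCKINFO ckRiff = {0};
            ckRiff.fccType = mmioFOURCC('W', 'A', 'V', 'E');
            mmioCreateChunk(hmmio, &ckRiff, MMIO_CREATERIFF);

            // 2. 'fmt ' chunk describing the sample format
            MMCKINFO ckFmt = {0};
            ckFmt.ckid = mmioFOURCC('f', 'm', 't', ' ');
            mmioCreateChunk(hmmio, &ckFmt, 0);
            mmioWrite(hmmio, reinterpret_cast<const char*>(&wfx), sizeof(wfx));
            mmioAscend(hmmio, &ckFmt, 0);            // patches the 'fmt ' chunk size

            // 3. 'data' chunk holding the audio samples
            MMCKINFO ckData = {0};
            ckData.ckid = mmioFOURCC('d', 'a', 't', 'a');
            mmioCreateChunk(hmmio, &ckData, 0);
            mmioWrite(hmmio, reinterpret_cast<const char*>(samples),
                      static_cast<LONG>(cbSamples));

            // 4. Ascend out of 'data' and then out of the RIFF chunk,
            //    which seeks back and fixes up both chunk sizes
            mmioAscend(hmmio, &ckData, 0);
            mmioAscend(hmmio, &ckRiff, 0);

            // 5. Close, flushing the MMIO_ALLOCBUF buffer to the file
            return mmioClose(hmmio, 0) == MMSYSERR_NOERROR;
        }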

    Read the article

  • Audio stream management in Linux

    - by User1
    I have a very complicated audio setup for a project. Here's what we have:
    3 applications playing sound
    2 applications recording sound
    2 sound cards
    I really don't have the code to any of these applications. All I want to do is monitor and control the audio streams. Here are a few examples of operations I'd like to do while the applications are running:
    Mute one of the incoming audio streams.
    Have one of the incoming audio streams do a "solo" (be the only stream that can "talk").
    Get a graph (about 30 seconds worth) of the audio that each stream produced.
    Send one of the audio streams to soundcard #1, but all three audio streams to soundcard #2.
    I would likely switch audio streams every 2 minutes or so with one of the operations listed above. A GUI would be preferred. I started looking at the sound systems in Linux and it gets extremely complex, and I feel like there have been many new advances in the past few years. I see JACK, PulseAudio, artsd, and several other packages. They all have some promise, but where should I start? Is there something someone already built that can help?

    Read the article

  • Using PHP to store results of a post request

    - by Paul M
    I'm currently working with an API which requires we send our collection details in XML to their server using a post request. Nothing major there, but it's not working, so I want to output the sent XML to a txt file so I can look at what's actually being sent!! Instead of posting to the API I'm posting to a document called target, but the XML it's recording seems to be really wrong. Here is my target script; note that the posting script posts 3 items, so the file being written should have details of each post request one after the other.

        <?php
        error_reporting(E_ALL);
        ini_set('display_errors', 1);
        // get the request data...
        $payload = '';
        $fp = fopen('php://input','r');
        $output_file = fopen('output.txt', 'w');
        while (!feof($fp)) {
            $payload .= fgets($fp);
            fwrite($output_file, $payload);
        }
        fclose($fp);
        fclose($output_file);
        ?>

    I also tried the following, but this just recorded the last post request, so only 1 collection item was recorded in the txt file, instead of all 3:

        $output_file = fopen('output.txt', 'w');
        while (!feof($fp)) {
            $payload .= fgets($fp);
        }
        fwrite($output_file, $payload);
        fclose($fp);
        fclose($output_file);

    I know I'm missing something really obvious, but I've been looking at this all morning!

    Read the article

  • AVAudioPlayer working in Simulator, but not on device

    - by cannyboy
    My mp3 playing code is: NSError *error; soundObject = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:audioPathString] error:&error]; if (soundObject == nil) NSLog(@"%@", [error description]); soundObject.delegate = self; soundObject.numberOfLoops = 0; soundObject.volume = 1.0; NSLog(@"about to play"); [soundObject prepareToPlay]; [soundObject play]; NSLog(@"[soundObject play];"); The mp3 used to play fine, and it still does on the simulator. But not on the device. I've recently added some sound recording code (not mine) to the software. It uses AudioQueue stuff which is slightly beyond me. Does that conflict with AVAudioPlayer? Or what could be the problem? I've noticed that as soon as the audiorecording code starts working, I can't adjust the volume on the device anymore, so maybe it blocks the audio playback?. EDIT The solution seems to be to put this in my code. I put it in applicationDidFinishLaunching: [[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayAndRecord error: nil]; UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker; AudioSessionSetProperty (kAudioSessionProperty_OverrideAudioRoute,sizeof (audioRouteOverride),&audioRouteOverride); The first line allows both play and record, whilst the other lines apparently reroute things to make the volume louder. All audio code is voodoo to me.

    Read the article

  • MediaRecorder Prepare Failed

    - by AndyTang
    Hi, I'm new here. I have been trying to create a video capture app using the Android emulator without much success. As far as I know, and looking through all the samples and code on the internet (this site and others), I must still be missing a step. I've tried using the sample near the end of this thread, made by JonPro: http://www.anddev.org/viewtopic.php?p=24723#24723 and I've tried making my own, but the media recorder would always fail at the prepare stage with the most unhelpful message of 'prepare failed'. I have no clue what I am missing. I seem to have the correct permissions, and an SD card is mounted according to the emulator. Should I be using an Android SDK version other than 2.1? Even though the code in that forum claims to work, I figured out that this line was missing: recorder.setCamera(camera); But still no joy, as the logs show 'Failed to get camera(0x16b70) parameters' when prepare() is called, but it still doesn't make sense as the preview is okay, yet there is no recording! Any help or suggestions will be appreciated.

    Read the article

  • compile ICS/JB camera application - native library jni-mosaic error

    - by liorry
    I would like to use the Panorama mode that the ICS/JB camera application has. I've downloaded the source code (with resources) and managed to solve all code compilation errors but as soon as I start the application on my device (running JB), I get this error: 10-25 14:42:53.617: E/AndroidRuntime(23147): FATAL EXCEPTION: GLThread 2586 10-25 14:42:53.617: E/AndroidRuntime(23147): java.lang.UnsatisfiedLinkError: Native method not found: com.app.camera.panorama.MosaicRenderer.reset:(IIZ)V 10-25 14:42:53.617: E/AndroidRuntime(23147): at com.app.camera.panorama.MosaicRenderer.reset(Native Method) 10-25 14:42:53.617: E/AndroidRuntime(23147): at com.app.camera.panorama.MosaicRendererSurfaceViewRenderer.onSurfaceChanged(MosaicRendererSurfaceViewRenderer.java:49) 10-25 14:42:53.617: E/AndroidRuntime(23147): at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1505) 10-25 14:42:53.617: E/AndroidRuntime(23147): at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1240) I do have a libjni-mosaic lib, located in armeabi-v7a/armeabi/x86 and it seems to load it fine but it probably doesn't contain the methods the MosaicRenderer implements. I also tried compiling the CyanogenMod camera app https://github.com/CyanogenMod/android_packages_apps_Camera/tree/ics but I get the same error... The camera itself works, for stills and video recording but as soon as I change to panorama mode, it crashes. Can anyone maybe point me to the right jni-mosaic lib or maybe to what I'm doing wrong? Do I need to do something in order to make my app use the JNI/SO files?

    Read the article

  • Core Audio on iPhone - any way to change the microphone gain (either for speakerphone mic or headphone mic)?

    - by Halle
    After much searching the answer seems to be no, but I thought I'd ask here before giving up. For a project I'm working on that includes recording sound, the input levels sound a little quiet both when the route is external mic + speaker and when it's headphone mic + headphones. Does anyone know definitively whether it is possible to programmatically change mic gain levels on the iPhone in any part of Core Audio? If not, is it possible that I'm not really in "speakerphone" mode (with the external mic at least) but only think I am? Here is my audio session init code: OSStatus error = AudioSessionInitialize(NULL, NULL, audioQueueHelperInterruptionListener, r); [...some error checking of the OSStatus...] UInt32 category = kAudioSessionCategory_PlayAndRecord; // need to play out the speaker at full volume too so it is necessary to change default route below error = AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(category), &category); if (error) printf("couldn't set audio category!"); UInt32 doChangeDefaultRoute = 1; error = AudioSessionSetProperty (kAudioSessionProperty_OverrideCategoryDefaultToSpeaker, sizeof (doChangeDefaultRoute), &doChangeDefaultRoute); if (error) printf("couldn't change default route!"); error = AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange, audioQueueHelperPropListener, r); if (error) printf("ERROR ADDING AUDIO SESSION PROP LISTENER! %d\n", (int)error); UInt32 inputAvailable = 0; UInt32 size = sizeof(inputAvailable); error = AudioSessionGetProperty(kAudioSessionProperty_AudioInputAvailable, &size, &inputAvailable); if (error) printf("ERROR GETTING INPUT AVAILABILITY! %d\n", (int)error); error = AudioSessionAddPropertyListener(kAudioSessionProperty_AudioInputAvailable, audioQueueHelperPropListener, r); if (error) printf("ERROR ADDING AUDIO SESSION PROP LISTENER! %d\n", (int)error); error = AudioSessionSetActive(true); if (error) printf("AudioSessionSetActive (true) failed"); Thanks very much for any pointers.

    Read the article

  • trackPageView on Google Analytics for iPhone Not Working

    - by DigitalZombieKid
    I'm trying to get Google Analytics working on an iPhone application without much luck. I've followed all the instructions on their website (google/apis/analytics/docs/tracking/mobileAppsTracking.html) and studied their sample application (google/gaformobileapps/GoogleAnalyticsIphone_0.7.tar.gz). When I run my application and go to the Google Analytics website (https://www.google.com/analytics/reporting/), the only page that is recorded is /app_entry_point. I'm seeing one count in my Google Analytics detailed report every time my app fires up. However, I have added other pages to be tracked and it's not working. Here is a sample of two pages I've added to be tracked: trackPageview:@"/calculator"; trackPageview:@"/tellafriend"; I call them from various ViewControllers in the app. In each of those view controllers I import the GANTracker header: #import "GANTracker.h" I'll admit it: I'm an Objective-C newbie. Any help you can offer is greatly appreciated! Do I need to physically dispatch them to get trackPageview working? If so, why is the /app_entry_point page the only page that is recorded by Google Analytics?

    Read the article

  • String to array or Array to string tips on formats, etc

    - by user316841
    Hi, first of all thanks for taking your time! I'm a junior dev, working with PHP + MySQL. My issue: I'm saving data from a form to my database. From this form, there's only a need to save the contacts: name, phone number, address. But it would be nice to have a small reference to the user's answers. Let's say for each question we've got a value between 1 and 4. There's no need to create a table just for it, because what's needed is just the personal contacts. I'm thinking of recording each question/answer as a letter and its corresponding value. Example: (A2, B1, C5, D3, etc). My question is: is there a format I could handle easily afterwards? Convert to an array (string to array) in case the client changes ideas and asks for this data to be placed in table columns? Just to prevent this situation! Example: from (A2, B1, C5) to array( "A" => "2", "B" => "1", "C" => "5" ). For now I guess Regex is the answer, but it's always hard to figure out and I'm always getting into trouble =) Thanks!

    Read the article

< Previous Page | 32 33 34 35 36 37 38 39 40 41  | Next Page >