Search Results

Search found 33965 results on 1359 pages for 'oracle upk content'.

  • JMX Monitoring of GlassFish Servers

    - by tjquinn
    Did you ever wonder what this message in your GlassFish server.log file means?

        JMXStartupService has started JMXConnector on JMXService URL service:jmx:rmi://192.168.2.102:8686/jndi/rmi://192.168.2.102:8686/jmxrmi

    It means you can monitor any GlassFish server process, remotely or locally, using any standard Java Management Extensions (JMX) client, such as jconsole or jvisualvm. Copy the part of the log message that starts with "service:" into the Add JMX Connection dialog of jvisualvm, or into the New Connection dialog of jconsole. (The full string is truncated in the on-screen display, but if you copy from the server.log and paste it into the form, it will all be there.) The example above is for a DAS, and your host will probably be different. The server.log files for other GlassFish servers (instances) will have similar log entries giving the JMX connection strings to use for those processes; look for a different host and/or port.

    Note a few things about security:

    - Here we've assumed you are using the default admin username and password. If you are not, just enter a valid admin username and password for your installation.
    - Once connected, you have normal access to all the JVM statistics and controls.
    - You can use JMX clients that support MBeans to view the GlassFish configuration. When you connect to the DAS, you can also change that configuration; when you connect to an instance, you can only view it.
    - To use a JMX client on one system to connect to a GlassFish server running on another system, you need to enable secure admin if you have not already done so:

        asadmin change-admin-password (respond to the prompts)
        asadmin enable-secure-admin
        asadmin restart-domain (as prompted in the output from enable-secure-admin)
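    The same connection string works programmatically, too. Below is a minimal sketch, not from the original post, of a plain JMX client that connects with JMXConnectorFactory and reads a couple of values; the URL is the example above, and the credentials are placeholders for your own installation:

        import java.util.HashMap;
        import java.util.Map;
        import javax.management.MBeanServerConnection;
        import javax.management.ObjectName;
        import javax.management.remote.JMXConnector;
        import javax.management.remote.JMXConnectorFactory;
        import javax.management.remote.JMXServiceURL;

        public class GlassFishJmxProbe {
            public static void main(String[] args) throws Exception {
                // Paste the "service:..." string from your own server.log here
                JMXServiceURL url = new JMXServiceURL(
                        "service:jmx:rmi://192.168.2.102:8686/jndi/rmi://192.168.2.102:8686/jmxrmi");

                // Placeholder credentials: use a valid admin username and password
                Map<String, Object> env = new HashMap<String, Object>();
                env.put(JMXConnector.CREDENTIALS, new String[] { "admin", "adminadmin" });

                JMXConnector connector = JMXConnectorFactory.connect(url, env);
                try {
                    MBeanServerConnection mbsc = connector.getMBeanServerConnection();
                    System.out.println("MBeans visible: " + mbsc.getMBeanCount());

                    // Standard JVM MBean: read the uptime of the monitored server
                    ObjectName runtime = new ObjectName("java.lang:type=Runtime");
                    System.out.println("Uptime (ms): " + mbsc.getAttribute(runtime, "Uptime"));
                } finally {
                    connector.close();
                }
            }
        }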

  • Zombiewood for your Java ME tech-enabled Nokia C3

    - by hinkmond
    Zombies... Zoooombies... Here come the zombies in the new Zombiewood game for your Java ME technology-enabled Nokia C3. Watch the video to check it out. See: Zombiewood on Java ME Nokia C3 If you had two handguns and a couple sticks of dynamite, I'm sure you'd be looking to shoot zombies and collect giant floating gold coins spinning on the sidewalk. 'Cause that's what you do in that situation, right? Hinkmond

  • What's new with Java technology? Java Embedded

    - by hinkmond
    As this article points out, Java Embedded is a safer, more robust and easier to develop platform for small networked devices. So, get ready for good things to come from Java Embedded... See: Java Embedded: Next New Thing Here's a quote: Through the past few years the industry as we know it has seen a big boom with the mobile and cloud revolution. Today, there has been an enormous amount of buzz around machine to machine (M2M) or the "Internet of Things," since we are moving into a state where everything is going to have to be interconnected and will have to properly communicate together... Today, Java Embedded provides that platform. I like it! As long as there's no Zombie Apocalypse, I think Java Embedded has a great future! Hinkmond

  • Reading OpenDocument spreadsheets using C#

    - by DigiMortal
    Excel with its file formats is not the only spreadsheet application in wide use. There are also users on Linux and Macs, and they often use OpenOffice and other open-source office packages that save ODF files instead of OpenXML. In this post I will show you how to read an Open Document spreadsheet in C#.

    Importer as example

    My previous post about importers showed you how to build flexible importer support into your web application. This post introduces a practical example: one of my importers. Of course, sensitive code is omitted. We start with the ODS importer class and add new methods as we go.

        public class OdsImporter : ImporterBase
        {
            public OdsImporter()
            {
            }

            public override string[] SupportedFileExtensions
            {
                get { return new[] { "ods" }; }
            }

            public override ImportResult Import(Stream fileStream, long companyId, short year)
            {
                string contentXml = GetContentXml(fileStream);

                var result = new ImportResult();
                var doc = XDocument.Parse(contentXml);

                var rows = doc.Descendants("{urn:oasis:names:tc:opendocument:xmlns:table:1.0}table-row").Skip(1);

                foreach (var row in rows)
                {
                    ImportRow(row, companyId, year, result);
                }

                return result;
            }
        }

    The class given here just extends the base class for importers (the previous post uses an interface, but as I noted there, you move to an abstract base class when writing code for real projects). The Import method reads data from the *.ods file, parses it (it is XML), finds all the data rows, and imports their data. Notice that the first row is skipped: the first row of my sheet is always the header row.

    Reading the ODS file

    Our Import method starts by getting the XML out of the *.ods file. ODS files, like OpenXml files, are zipped containers that hold several files. We need content.xml, as all the data is kept there. To get to it we use the SharpZipLib library to read the uploaded file as a *.zip file.

        private static string GetContentXml(Stream fileStream)
        {
            var contentXml = "";

            using (var zipInputStream = new ZipInputStream(fileStream))
            {
                ZipEntry contentEntry = null;
                while ((contentEntry = zipInputStream.GetNextEntry()) != null)
                {
                    if (!contentEntry.IsFile)
                        continue;
                    if (contentEntry.Name.ToLower() == "content.xml")
                        break;
                }

                // Guard against archives that contain no content.xml at all
                if (contentEntry == null || contentEntry.Name.ToLower() != "content.xml")
                {
                    throw new Exception("Cannot find content.xml");
                }

                var bytesResult = new byte[] { };
                var bytes = new byte[2000];
                var i = 0;

                while ((i = zipInputStream.Read(bytes, 0, bytes.Length)) != 0)
                {
                    var arrayLength = bytesResult.Length;
                    Array.Resize<byte>(ref bytesResult, arrayLength + i);
                    Array.Copy(bytes, 0, bytesResult, arrayLength, i);
                }
                contentXml = Encoding.UTF8.GetString(bytesResult);
            }
            return contentXml;
        }

    When content.xml is found, we stop browsing the archive, read that entry into memory, and return it as a UTF-8 string.

    Importing rows

    Our last task is to import the rows. We use a separate method for this because there are a couple of tricks to handle. To keep files smaller, the cell count of a row is not always the same: if more than one empty cell occurs in a run, ODS keeps only a single cell element for the whole run of empty cells. That cell has an attribute called number-columns-repeated, whose value is set to the number of sequential empty cells. This is why we use two indexes over the cells collection.

        private void ImportRow(XElement row, long companyId, short year, ImportResult result)
        {
            var cells = (from c in row.Descendants()
                         where c.Name == "{urn:oasis:names:tc:opendocument:xmlns:table:1.0}table-cell"
                         select c).ToList();

            var dto = new DataDto();

            var count = cells.Count;
            var j = -1;

            for (var i = 0; i < count; i++)
            {
                j++;
                var cell = cells[i];

                // A repeated empty cell stands for several columns:
                // advance the logical column index accordingly
                var attr = cell.Attribute("{urn:oasis:names:tc:opendocument:xmlns:table:1.0}number-columns-repeated");
                if (attr != null)
                {
                    var numToSkip = 0;
                    if (int.TryParse(attr.Value, out numToSkip))
                    {
                        j += numToSkip - 1;
                    }
                }

                if (i > 30) break;

                if (j == 0)
                {
                    dto.SomeProperty = cells[i].Value;
                }
                if (j == 1)
                {
                    dto.SomeOtherProperty = cells[i].Value;
                }
                // some more data reading
            }

            // save data
        }

    You can define your own class for import results and collect there all the problems found during the import. Your application gets the results and shows them to the user.

    Conclusion

    Reading ODS files may seem a complex task, but it is actually easy if all we need is the data in those documents. We can use a zip library to get the content file and then parse it as XML. Going through the XML is not hard, but there are some optimization tricks we have to know. The code here is safe to use in web applications, as it uses no APIs that make special demands on the server or infrastructure.

  • Techn'Oracle and Sales'Oracle Are Here!

    - by swalker
    The Techn'Oracle and Sales'Oracle sessions have been launched in the regions. As every year, several hundred employees of our Partners will be trained on our latest offerings and products! The Techn'Oracle sessions present the technical content of our products and their benefits for Customers, in terms of architecture, TCO, and flexibility for Information Systems. The Sales'Oracle sessions are designed to show the value of our offerings in Customer terms, the "what is it for", to help Partners increase their presence and revenue with our Customers. >> See the Calendar for dates and topics here

  • Changes to Specialization for VADs

    - by swalker
    As of November 1, 2011, VADs (Value Added Distributors) with a valid VAD agreement no longer have to meet the customer-reference requirements listed in the business criteria section in order to achieve a specialization. However, VADs must still meet all business and competency criteria in the corresponding Knowledge Zone before their specialization is recognized.

  • The Other Side of the Coin

    - by peter.nagy
    When it comes to Exalogic, if people have heard of it at all, the first thing that comes to mind is performance, as with almost every new version of a server solution. That is obviously no different here, and we can be proud of it. But an equally important property of Exalogic is that it is ready for action almost immediately. Part of this is already available elsewhere, since we can download ready-made virtual machines with complete installations, but there you still have to put the virtualization software onto the hardware yourself. And that is where the problems start. Anyone who has ever seen, done, or watched the build-out of the infrastructure needed for a system knows what I am talking about: the installation, the right versions, the required patches, cabling the network equipment, and then configuring all of it. In the best case it takes days, more likely weeks, before everything is ready for operation. With Exalogic this is different, because it is assembled at the factory, and setting up the configuration comes down to a few parameter files and you are done. So how long does it take from unpacking the box? A single day. The icing on the cake is the full-stack upgrade, rather than upgrading every single component separately: from the firmware all the way up to the WebLogic server.

  • RPi and Java Embedded GPIO: Java code to blink more LEDs

    - by hinkmond
    Now it's time to blink the other GPIO ports, with the other LEDs connected to them. This is easy using Java Embedded, since the Java programming language is powerful and flexible. Embedded developers are not used to this, since the C programming language is more popular but less easy to develop in. We just need a Java String array that maps to the pinouts of the GPIO port names from the previously posted diagram. This way we can address each "channel" with an index into that String array.

        static String[] GpioChannels = { "0", "1", "4", "17", "21", "22", "10", "9" };

    With this new array, we can streamline the main() of this Java program to activate all the ports.

        /**
         * @param args the command line arguments
         */
        public static void main(String[] args) {
            FileWriter[] commandChannels;

            try {
                /*** Init GPIO ports for output ***/
                // Open file handles to GPIO port unexport and export controls
                FileWriter unexportFile = new FileWriter("/sys/class/gpio/unexport");
                FileWriter exportFile = new FileWriter("/sys/class/gpio/export");

                for (String gpioChannel : GpioChannels) {
                    System.out.println(gpioChannel);

                    // Reset the port
                    unexportFile.write(gpioChannel);
                    unexportFile.flush();

                    // Set the port for use
                    exportFile.write(gpioChannel);
                    exportFile.flush();

                    // Open file handle to port input/output control
                    FileWriter directionFile =
                            new FileWriter("/sys/class/gpio/gpio" + gpioChannel + "/direction");

                    // Set port for output (GPIO_OUT is defined earlier in this series)
                    directionFile.write(GPIO_OUT);
                    directionFile.flush();
                }

    And then simply add array code where we blinked the single LED, to make it blink all the LEDs on and off at once. (The excerpt truncates the listing at this point; the loop below is a reasonable reconstruction of the obvious next step, opening a command channel to each port's value file.)

                /*** Send commands to GPIO ports ***/
                commandChannels = new FileWriter[GpioChannels.length];

                for (int channum = 0; channum < GpioChannels.length; channum++) {
                    // Open a file handle to each port's value control
                    commandChannels[channum] =
                            new FileWriter("/sys/class/gpio/gpio" + GpioChannels[channum] + "/value");
                }

    It's easier than falling off a log... or at least easier than C programming. Hinkmond

  • The Sound of Two Toilets Flushing: Constructive Criticism for Virgin Atlantic Complaints Department

    - by Geertjan
    I recently had the experience of flying from London to Johannesburg and back with Virgin Atlantic. The good news was that it was the cheapest flight available and that the takeoff and landing were absolutely perfect. Hence I really have no reason to complain. Instead, I'd like to offer some constructive criticism which hopefully Richard Branson will find sometime while googling his name. Or maybe someone from the Virgin Atlantic Complaints Department will find it; whatever, I just want to put this information out there.

    Arrangement of restroom facilities. Maybe next time you design an airplane, consider not putting your toilets at a right angle right next to your rows of seats. Being able to reach, without even needing to stretch your arm, from your seat to close, yet again, a toilet door that someone, someone obviously sitting very far from the toilets, carelessly forgot to close is not an indicator of quality interior design. Have you noticed how all other airplanes have their toilets in a cubicle separated from the rows of seats? On those airplanes, people sitting in the seats near the toilets are not constantly being woken up throughout the night whenever someone enters/exits the toilet, whenever the light in the toilet is suddenly switched on, and whenever one of the toilets flushes. Bonus points for Virgin Atlantic passengers in the seats adjoining the toilets come when multiple toilets are flushed simultaneously and multiple passengers enter/exit them at the same time, a bit like an unasked-for low-budget musical of suddenly illuminated grumpy people in crumpled clothes. What joy that brings at 3 AM is hard to describe.

    Seats with extra leg room. You know how other airplanes have seats with extra leg room? You know what those seats tend to have? Extra leg room. It's really interesting how Virgin Atlantic's seats with extra leg room actually have no extra leg room at all. It should have been a giveaway, the fact that these special seats are found in the same rows as the standard seats, rather than on the cusp of real glory, which is where most airlines put their extra-leg-room seats, with the only actual difference being that they have a slightly different color. Had you called them "seats with a different color" (i.e., almost not quite green, rather than something vaguely hinting at blue), at least I'd have known what I was getting. Picture the joy at 3 AM, rudely awakened from nightmarish slumber, partly grateful to have been released from a grayish dream of faceless zombies resembling one or two of those in a recent toilet line, by multiple adjoining toilets flushing simultaneously, while you're sitting in a seat with extra leg room that has exactly as much leg room as the seats in neighboring rows. You then have a choice of things to be sincerely annoyed about.

    Food from the '80s. In the '80s, airplane food came in soggy containers and even breakfast, the most important meal of the day, was a sad heap of vaguely gray colors. The culinary highlight tended to be a squashed tomato, which must have been mashed to a pulp with a brick prior to being regurgitated by a small furry animal, and there was also always a piece of immensely horrid pumpkin, as well as a slice of spongy something you'd never seen before. Sausages and mash at 6 AM on an airplane was always a heavy lump of horribleness. Thankfully, all airlines throughout the world changed from this puke-inducing strategy around 1987 sometime. Not Virgin Atlantic, of course. The fatty sausages and mash are still there, bringing you flashbacks to Duran Duran, which is what you were listening to (on your Walkman) the last time you saw it in an airplane. Even the golden oldie "squashed tomato attached by slime to three wet peas" is on the menu. How wonderful to have all this in a cramped seat with a long row of early morning bleariness lined up for the toilets, right at your side, bumping into your elbow, groggily, one by one, one after another, more and more, fumble-open-door-silence-flush-fumble-open-door, and on and on, while you tentatively push your fork through a soggy pile of colorless mush, fighting the urge to throw up on the stinky socks of whatever nightmarish zombie is bumping into your elbow at the time.

    But, then again, the plane landed without a hitch, in fact, extremely smoothly, so I'm certainly not blaming the pilots.

  • EclipseLink Moxy Provider for JAX-RS and JAX-WS

    - by arungupta
    EclipseLink MOXy is a JAXB provider bundled in GlassFish 3.1.2. In addition to the JAXB RI, it provides XPath-based mapping, better support for JPA entities, native JSON binding, and many other features. Learn more about MOXy and JAXB examples on their wiki. Blaise blogged about how MOXy can be leveraged to create a JAX-WS service: you just need to provide the databinding attribute in sun-jaxws.xml, and then all the XPath-based mapping can be specified on the JAXB beans. MOXy can also be used as a JAX-RS JSON provider, on both the server side and the client side. How are you using MOXy in your applications?
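    As a taste of the XPath-based mapping, here is a minimal sketch, not from Blaise's post, of a bean mapped with MOXy's @XmlPath annotation; the Customer class and the element paths are invented for illustration. To make MOXy the JAXB provider for the package, you would also drop a jaxb.properties file next to the class containing the single line javax.xml.bind.context.factory=org.eclipse.persistence.jaxb.JAXBContextFactory.

        import javax.xml.bind.annotation.XmlAccessType;
        import javax.xml.bind.annotation.XmlAccessorType;
        import javax.xml.bind.annotation.XmlRootElement;
        import org.eclipse.persistence.oxm.annotations.XmlPath;

        @XmlRootElement
        @XmlAccessorType(XmlAccessType.FIELD)
        public class Customer {

            // XPath-based mapping: bind nested XML content to flat Java fields
            @XmlPath("personal-info/name/text()")
            private String name;

            @XmlPath("contact-info/phone/text()")
            private String phone;
        }

    With that factory in place, JAXBContext.newInstance(Customer.class) resolves to MOXy instead of the JAXB RI, so the same mappings apply wherever JAXB is used, including JAX-RS and JAX-WS endpoints.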

  • New Analytic settings for the new code

    - by Steve Tunstall
    If you have upgraded to the new 2011.1.3.0 code, you may find some very useful settings for Analytics. If you didn't already know, the analytic datasets have the potential to fill up your OS hard drives. The more datasets you use and create, the faster this can happen. Since they take a measurement every second, forever, some of these metrics can grow to multiple GB in a matter of weeks. The traditional 'fix' was that you had to go into Analytics -> Datasets about once a month and clean up the largest datasets by deleting them. Ouch. Now you lost all of that historical data that you might have wanted to check out many months from now. Or, you had to export each metric individually to a CSV file first. Not very easy or fun. You could also suspend a dataset, and have it not collect data at all. Well, that fixed the problem, didn't it? Of course, you now had no data to go look at. Hmmmm....

    All of this is no longer a concern. Check out the new Settings tab under Analytics. Now I can tell the ZFSSA to keep every second of data for, say, 2 weeks, and then average those 60 seconds of each minute into a single 'minute' value. I can go even further and ask it to average those 60 minutes of data into a single 'hour' value. This allows me to effectively shrink my older datasets to 1/3600th of their size. Very cool. I can now allow my datasets to go on forever, and really never have to worry about them filling up my OS drives.

    That's great going forward, but what about those huge datasets you already have? No problem. Another new feature in 2011.1.3.0 is the ability to shrink the older datasets in the same way. I have here a dataset called "Disk: I/O ops per second" that is about 6.32MB on disk. (You need not worry so much about the "In Core" value, as that is in RAM, and it fluctuates all the time. Once you stop viewing a particular metric, you will see that shrink over time; just relax.) Clicking the trash can icon to the right of the dataset used to delete the whole thing, and you would have to re-create it from scratch to get the data collecting again. Now, however, it prompts you, and allows you to once again shrink the dataset by averaging the second data into minutes or hours. After doing this, my dataset shrank from 6.32MB down to 2.87MB, but I can still see my metrics going back to the time I began the dataset.

    Now, you do understand that once you do this, as you look back in time to the minute or hour data metrics, you are going to see much larger time values, right? You will need to decide what granularity you can live with, and for how long. As an example, I compared my "Disk: Percent utilized" metric from 5-21-2012 2:42 pm to 4:22 pm before and after going through the delete process to change everything older than 1 week to "Minutes"; the same date and time range is, of course, much coarser afterwards. Just understand what this will do and how you want to use it. Right now, I'm thinking of keeping the last 6 weeks of data as "seconds", then the last 3 months as "Minutes", and then "Hours" forever after that. I'll check back in six months and see how the sizes look.

    Steve

  • Session Report - Modern Software Development Anti-Patterns

    - by Janice J. Heiss
    In this standing-room-only session, building upon his 2011 JavaOne Rock Star “Diabolical Developer” session, Martijn Verburg, this time along with Ben Evans, identified and explored common “anti-patterns” – ways of doing things that keep developers from doing their best work. They emphasized the importance of social interaction and team communication, along with identifying certain psychological pitfalls that lead developers astray. Their emphasis was less on technical coding errors and more on how to function well and keep one’s focus on what really matters. They are the authors of the highly regarded The Well-Grounded Java Developer and are both movers and shakers in the London JUG community and on the Java Community Process. The large room was packed as they gave a fast-moving, witty presentation with lots of laughs and personal anecdotes. Below are a few of the anti-patterns they discussed.

    Anti-Pattern One: Conference-Driven Delivery

    The theme here is the belief that “Real pros hack code and write their slides minutes before their talks.” Their response to this anti-pattern is an expression popular in the military – PPPPPP, which stands for “Proper preparation prevents piss-poor performance.” “Communication is very important – probably more important than the code you write,” claimed Verburg. “The more you speak in front of large groups of people the easier it gets, but it’s always important to do dry runs, to present to smaller groups. And important to be members of user groups where you can give presentations. It’s a great place to practice speaking skills, to gain new skills, to get new contacts, to network.” They encouraged attendees to record themselves and listen to themselves giving a presentation. They advised them to start with a spouse or friends if need be. Learning to communicate to a group, they argued, is essential to being a successful developer. The emphasis here is that software development is a team activity and good, clear, accessible communication is essential to the functioning of software teams.

    Anti-Pattern Two: Mortgage-Driven Development

    The main theme here was that, in a period of worldwide recession and economic stagnation, people are concerned about keeping their jobs. So there is a tendency for developers to treat knowledge as power and not share what they know about their systems with their colleagues, so that when it comes time to fix a problem in production, they will be the only ones who know how to fix it – having made themselves an indispensable cog in a machine, they cannot be fired. So developers avoid documentation at all costs, or if documentation is required, put it on a USB stick and lock it in a lock box. As in the first anti-pattern, the idea here is that communicating well with your colleagues is essential, and documentation is a key part of this. Social interactions are essential. Both Verburg and Evans insisted that increasingly, year by year, successful software development is more about communication than the technical aspects of the craft. Developers who understand this are the ones who will have the most success.

    Anti-Pattern Three: Distracted by Shiny – Always Use the Latest Technology to Stay Ahead

    The temptation here is to pick out some obscure framework, try a bit of Scala, HTML5, and Clojure, and always use the latest technology and upgrade to the latest point release of everything. Don’t worry if something works poorly because you are ahead of the curve. Verburg and Evans insisted that there need to be sound reasons for everything a developer does. Developers should not bring in something simply because for some reason they just feel like it or because it’s new. They recommended a site run by a developer named Matt Raible with excellent comparison spreadsheets regarding Web frameworks and other apps. They praised it as a useful tool to help developers in their decision-making processes. They pointed out that good developers sometimes make bad choices out of boredom, to add shiny things to their CV, out of frustration with existing processes, or just from a lack of understanding. They pointed out that some code may stay in a business system for 15 or 20 years, but not all code is created equal and some may change after 3 or 6 months. Developers need to know where the code they are contributing fits in. What is its likely lifespan?

    Anti-Pattern Four: Design-Driven Design

    The anti-pattern: if you want to impress your colleagues and bosses, use design patterns left, right, and center – MVC, Session Facades, SOA, etc. Or the UML modeling suite from IBM, back in the day… Generate super fast code. And the more jargon you can talk when in the vicinity of the manager, the better. Verburg shared a true story about a time when he was interviewing a guy for a job and asked him what his previous work was. The interviewee said that he essentially took patterns from an approved book of Enterprise Architecture Patterns and applied them. Verburg was dumbstruck that someone could have a job in which they took patterns from a book and applied them. He pointed out that the idea that design is a separate activity is simply wrong. He repeated a saying that he uses: “You should pay your junior developers for the lines of code they write and the things they add; you should pay your senior developers for what they take away.” He explained that by encouraging people to take things away, the code base gets simpler and reflects the actual business use cases developers are trying to solve, as opposed to the framework that is being imposed. He told another true story about a project to decommission a very large system. 98% of the code was decommissioned and people got a nice bonus. But the 2% remained on the mainframe, so the 98% reduction in code resulted in zero reduction in costs, because the entire mainframe was still needed to run the 2% that was left. There is an incentive to get rid of source code and subsystems when they are no longer needed.

    The session continued with several more anti-patterns that were equally insightful.

  • YouTube: Promotional AgroSense Movie

    - by Geertjan
    Here's a cool YouTube promotional movie on AgroSense created by Ordina in the Netherlands. AgroSense is an open source Java system for the precision agriculture industry, and it won the IT Environment Award in the Netherlands last week. If your understanding of Dutch limits your appreciation of the movie, here's a rough translation, together with the names of the speakers: Precision agriculture, an innovative form of agriculture in which local variations in soil, crop, and atmosphere are taken into account, is the high-tech sustainable agriculture of tomorrow. The use of fertilizer, water, and energy can in this way be significantly reduced. "If, ten or twenty years from now, we are to continue having our agricultural industry in good shape, and in a continuing state of health, we'll need to register and work with data, because if we want to enable crops to provide higher value, we'll need to create higher levels of transparency throughout the agriculture chain." Lenus Hamster, farmer in Nieuwolda, Groningen. "Industry is becoming increasingly data intensive. By combining pragmatic usefulness with innovative sustainability, AgroSense offers the Netherlands the possibility to continue being a leading player in the agrofood sector." Art Lighthart, Architect at Ordina. AgroSense offers an open source solution in which all services for precision agriculture are brought together. In 2012, co-operation is being sought with organizations to make AgroSense available to around 10,000 Dutch farmers in the arable crop sector. By the way, the last sentence above implies the NetBeans Platform will be used by around 10,000 Dutch farmers.

  • JDeveloper and Upgrading Your JDK on Ubuntu

    - by Duncan Mills
    One little gotcha: if you, as I did recently, upgrade your JDK on Ubuntu, then you may have to make sure you reflect that change in a couple of places for JDeveloper to stay happy. Assuming that you've installed from the jar version of the JDeveloper installer, the JDK that you specified at install time will be recorded in the .jdev_jdk file in your home directory. However, be aware that this is not the only reference to the absolute path of the JDK. When you run the embedded WebLogic for the first time, the .jdeveloper/system11.1.1.3.37.56.60/DefaultDomain/bin/startWebLogic.sh script will be created, and, associated with that, the setDomainEnv.sh script in the same directory. So, if you do want to change the JDK location, be sure to change this file as well (or, of course, do everything with symbolic links).

  • AT&T Application Resource Analyzer in NetBeans IDE

    - by Geertjan
    Here at Øredev in Malmö I met Doug Sillars, who does developer outreach for the AT&T Application Resource Optimizer. In this YouTube clip you can see Doug explaining how it works and what it can do for optimizing the performance of mobile applications. There's a free and open source Android app on GitHub that you can install on Android to collect data, and then there's a Java Swing application for analyzing the results. As a plugin in NetBeans IDE, that application shows the Android sources of the Data Collector, as well as the Data Analyzer ready to be used to collect data. Since the ARO Data Analyzer is written in Java and has JPanels defining its UI layer, integrating the user interface wasn't hard. I'm now working on the Actions, so there'll be a new ARO menu with start/stop data collection menu items, etc., reusing as much of the original code as possible, along the lines of the sketch below. That part is actually already working: I started up an Android emulator, then started the data collection process from the IDE. Next I need to include the Actions for importing the data into the analyzer, together with a few other related features. A pretty cool feature in ARO is video capture, so that a movie can be made by ARO of all the steps taken on the device during the collection process; that will also be nice to have integrated into the NetBeans plugin. Ultimately, this will be handy for anyone creating Android applications in NetBeans IDE, since they'll be able to use AT&T's ARO tool for optimizing the performance of the applications they're developing. It will also be useful for those using the built-in Cordova tools in NetBeans IDE to create iOS applications, because ARO is also applicable to analyzing iOS application performance.
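    By way of illustration, here is a minimal sketch of how such a menu Action is typically registered on the NetBeans Platform. This is not the actual plugin code: the class name, category, menu path, and body are all placeholders.

        import java.awt.event.ActionEvent;
        import java.awt.event.ActionListener;
        import org.openide.awt.ActionID;
        import org.openide.awt.ActionReference;
        import org.openide.awt.ActionRegistration;

        // Registers a "Start ARO Data Collector" item in a new ARO menu
        @ActionID(category = "ARO", id = "com.example.aro.StartCollectorAction")
        @ActionRegistration(displayName = "Start ARO Data Collector")
        @ActionReference(path = "Menu/ARO", position = 100)
        public final class StartCollectorAction implements ActionListener {

            @Override
            public void actionPerformed(ActionEvent e) {
                // In the real plugin this would call into the reused ARO code base;
                // a println stands in for the actual start-collection entry point.
                System.out.println("Starting ARO data collection...");
            }
        }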

  • Be There: Tinkerforge/NetBeans Platform Integration Course

    - by Geertjan
    Tinkerforge is an electronic construction kit. It exposes a number of API bindings, including, of course, Java. The nice thing also is that Tinkerforge products are open source, on both the hardware and software levels, so you can take their designs as a starting point for your own modifications. "The TinkerForge system is a set of pre-built electronics boards that are built in such a way that you can stack the boards (known as bricks), attach accessories (known as bricklets), and have your prototype up and running quickly. Unlike systems such as the Arduino or Launchpad, the TinkerForge has to be attached to a computer, and the computer does all of the work. With an easy set of application programming interfaces (APIs) available in C/C++, C#, Java, PHP, and Ruby, the system is easy to interface with and program over USB in a snap." (from this useful article) A small taste of the Java bindings is sketched below. Henning Krüp, who has arranged several NetBeans Platform Certified Training Courses in the past, in the Nordhorn/Lingen area in Germany, had the inspired idea to focus the next course on integration with Tinkerforge. In other words, the whole course will be focused on creating a standalone Java desktop application that leverages the NetBeans Platform to interact with Tinkerforge! Interested in joining the course or setting up something similar yourself? The course organized by Henning will be held from 19 to 21 September, as explained here, together with contact details. If you'd like to organize a similar course at a location of your choosing, leave a comment at the end of this blog entry and we'll set something up together!
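    For a flavor of those Java bindings, here is a minimal sketch, not taken from the course material, that reads a value from a temperature bricklet. The UID string is a placeholder for the one assigned to your own hardware:

        import com.tinkerforge.BrickletTemperature;
        import com.tinkerforge.IPConnection;

        public class TemperatureProbe {
            public static void main(String[] args) throws Exception {
                // The brick daemon (brickd) listens on port 4223 by default
                IPConnection ipcon = new IPConnection();
                ipcon.connect("localhost", 4223);

                // "XYZ" stands in for the UID of your own temperature bricklet
                BrickletTemperature sensor = new BrickletTemperature("XYZ", ipcon);

                // The API reports temperature in units of 1/100 degrees Celsius
                short centiDegrees = sensor.getTemperature();
                System.out.println("Temperature: " + (centiDegrees / 100.0) + " °C");

                ipcon.disconnect();
            }
        }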

  • Kronos Workforce Mobile Apps (w/Java ME tech) lets bosses and staff work better

    - by hinkmond
    The Kronos Workforce Mobile apps let bosses spy on their workers, and let workers do what workers do best (uh, you know, work?), all using Java ME technology. See: Enable your Mobile Workforce w/Kronos Here's a quote: Kronos® Workforce Mobile™ Manager – allows managers to use their devices to monitor workforce operations, resolve exceptions, and respond quickly to employee requests. Kronos Workforce Mobile Employee – enables employees to track their work in real time, quickly and easily review information such as their schedules and timecards, and request time off. Kronos mobile applications are delivered as native applications for [blah-blah-blah]. A JavaME option is also available, which runs on a wide range of feature phones. Good stuff for the enterprise. Java ME technology helps run the mobile enterprise. I like that. Kinda catchy... Hinkmond

  • Meet up with the JCP at JavaOne Latin America

    - by Heather VanCura
    The JCP made it to JavaOne Brazil! We had a quickie presentation earlier today on JCP.Next that was well attended. Come see us at the OTN mini-theatre tomorrow from 12:00-12:15 pm for a quickie on participation. Then make your way to the Mezanino Sala 12 at 12:30 pm for CON-22250. "The Java Community Process: How You Can Make a Positive Difference" will be presented by Heather VanCura, JCP, and Fabio Velloso, SouJava, on Thursday, 6 December, at 12:30 pm. Find out more about how to participate in the JCP program, the JCP.Next effort, and how to get involved with Adopt-a-JSR through your JUG (or on your own)! Here is the description, translated from Portuguese: The JCP plays a fundamental role in the evolution of Java. The session will emphasize the value of transparency and participation through the JCP, Java User Groups, and the Adopt-a-JSR program. We will also explore some of the upcoming changes to the process through the JCP.Next initiative, and explain how you can get involved. Bring your questions, your suggestions, and your concerns. We want to hear from you, and to encourage and facilitate your active participation in advancing the Java platform.

  • Update on PeopleSoft 9.0

    Doris Wong, Group Vice President and General Manager of PeopleSoft Enterprise updates listeners on the new capabilities of PeopleSoft 9.0, the customer momentum with this new release and why more PeopleSoft customers should consider upgrading to this new release.

  • OBIEE 11g recommended patch sets

    - by THE
    Martin has busied himself with combining the recommended patch sets for OBIEE 11g into one single useful KM note. (This one contains the recommendations for 11.1.1.5 as well as those for 11.1.1.6.) OBIEE 11g: Required and Recommended Patches and Patch Sets (Doc ID 1488475.1). So if you are looking for update/patch information for your OBIEE installation, this is most likely a useful stop. And as patching is an ongoing process, you may want to bookmark this KM doc, as I am sure Martin will keep it current as new patches come out. Oh, and if you are looking for upgrade information from 11.1.1.5 to 11.1.1.6, KM Doc ID 1434253.1 might just be the thing you are looking for.

  • Architecture: Bringing Value to the Table

    - by Bob Rhubart
    A recent TechTarget article features an interview with Business Architecture expert William Ulrich (Take a business-driven approach to application modernization). In that article Ulrich offers this advice: "Moving from one technical architecture might be perfectly viable on a project by project basis, but when you're looking at the big picture and you want to really understand how to drive business value so that the business is pushing money into IT instead of IT pulling money back, you have to understand the business architecture. When we do that we're going to really be able to start bringing value to the table." In many respects that big-picture view is what software architecture is all about. As an architect, your technical skills must be top-notch. But if you don't apply that technical knowledge within the larger context of moving the business forward, what are you accomplishing? If you're interested in more insight from William Ulrich, you can listen to the ArchBeat Podcast interview he did last year, in which he and co-author Neal McWhorter talked about their book, Business Architecture: The Art and Practice of Business Transformation.

  • jtreg update, December 2012

    - by jjg
    There is a new version of jtreg available. The primary new feature is support for tests that have been written for use with TestNG, the popular open source testing framework. TestNG is supported by a variety of tools and plugins, which means that it is now possible to develop tests for OpenJDK using those tools, while still retaining the ability to have the tests be part of the OpenJDK test suite, and run with a single test harness, jtreg. jtreg can be downloaded from the OpenJDK jtreg page: http://openjdk.java.net/jtreg.

    TestNG support

    jtreg supports both single TestNG tests, which can be freely intermixed with other types of jtreg tests, and groups of TestNG tests. A single TestNG test class can be compiled and run by providing a test description using the new action tag:

        @run testng classname

    The test will be executed by using org.testng.TestNG. No main method is required. A group of TestNG tests organized in a standard package hierarchy can also be compiled and run by jtreg. Any such group must be identified by specifying the root directory of the package hierarchy. You can either do this in the top level TEST.ROOT file, or in a TEST.properties file in any subdirectory enclosing the group of tests. In either case, add a line to the file of the form:

        TestNG.dirs = dir ...

    Directories beginning with '/' are evaluated relative to the root directory of the test suite; otherwise they are evaluated relative to the directory containing the declaring file. In particular, note that you can simply use "TestNG.dirs = ." in a TEST.properties file in the root directory of the test group's package hierarchy. No additional test descriptions are necessary, but test descriptions containing information tags, such as @bug, @summary, etc., are permitted. All the Java source files in the group will be compiled if necessary, before any of the tests in the group are run. The selected tests within the group will be run, one at a time, using org.testng.TestNG.

    Library classes

    The specification for the @library tag has been extended so that any paths beginning with '/' will be evaluated relative to the root directory of the test suite. In addition, some bugs have been fixed that prevented sharing the compiled versions of library classes between tests in different directories. Note: this has uncovered some issues in tests that use a combination of @build and @library tags, such that some tests may fail unexpectedly with ClassNotFoundException. The workaround for now is to ensure that library classes are listed before the test classes in any @build tags. To specify one or more library directories for a group of TestNG tests, add a line of the following form to the TEST.properties file in the root directory of the group's package hierarchy:

        lib.dirs = dir ...

    As before, directories beginning with '/' are evaluated relative to the root directory of the test suite; otherwise they are evaluated relative to the directory containing the declaring file. The libraries will be available to all classes in the group; you cannot specify different libraries for different tests within the group.

    Coming soon ...

    From this point on, jtreg development will be using the new jtreg repository in the OpenJDK code-tools project. There is a new email alias jtreg-dev at openjdk.java.net for discussions about jtreg development. The existing alias jtreg-use at openjdk.java.net will continue to be available for questions about using jtreg.

    For more information ...

    An updated version of the jtreg Tag Language Specification is being prepared, and will be made available when it is ready. In the meantime, you can find more information about the support for TestNG by executing the following command:

        $ jtreg -onlinehelp TestNG

    For more information on TestNG itself, visit testng.org.
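    To make the single-test support concrete, here is a minimal sketch of a TestNG test written for jtreg, along the lines described above; the class name and the assertion are invented for illustration:

        /*
         * @test
         * @summary Minimal TestNG test run under jtreg (illustrative example)
         * @run testng SanityTest
         */

        import org.testng.Assert;
        import org.testng.annotations.Test;

        public class SanityTest {

            // No main method is needed; jtreg drives the class via org.testng.TestNG
            @Test
            public void stringConcatenationWorks() {
                Assert.assertEquals("ja" + "va", "java");
            }
        }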

  • Data Governance (Veri Yönetişimi)

    - by Arda Eralp
    Data governance is a system of responsibilities for operations involving data. The foundation of this system is formed by policies, standards, and procedures. Through these policies, standards, and procedures, the system decides when, under which conditions, in which activities, by which methods, and by whom the data will be used. For the system to operate successfully in an organization, awareness must first be established across the organization. Once awareness is established, the organization must adopt a governance and architecture culture. Only under these conditions can the system operate successfully. For these reasons, data governance is not a short-lived effort; on the contrary, it is a process that runs for as long as the organization exists. This tells us that data governance is not a project but a program. At the start of the program, the essentials are clarifying the organization's needs and establishing awareness. The target audience is everyone who works with data, directly or indirectly. Therefore, at the start of the program, meetings will be held with the teams that make up this target audience. These meetings both establish awareness and clarify the teams' needs, as the needs are conveyed directly by the teams themselves. Once the target audience's needs have been clarified, this continuously running process can be planned. In planning the process, the needs must be prioritized, because each team's needs may differ and not all needs can be met at the same time; the basis of this expectation is the interdependence of the teams' needs. After prioritizing the needs, the size of the organization must be taken into account. Its importance is this: if the organization is too large to be governed as a whole all at once, governance starts with the teams whose needs have the highest priority, so that the process can be rolled out across the whole organization at a steady pace. Once the needs are identified and the relevant teams selected, planning of the program can begin. The first item in the planning phase is the design of the Data Governance Office, which will control the stages of the process and maintain oversight once the process is operational within the organization. Along with the planning of the Office, the roles in the process and the responsibilities of those roles will be defined. The topics addressed in the planning phase are the Data Governance Office, roles and responsibilities, security, and the systems in which data is stored. When the planning phase is complete, the program can move into the operational phase with the selected teams and identified needs. In the operational phase, work is carried out on security and on the data storage systems, according to each team's needs. This work is documented as a process, and when it is finished, the same process is run with another team for another need; in this way the process is standardized across the organization. On the security side, both access security and usage security of the data are addressed. The work on data storage systems ensures that those systems are created and managed according to the standards defined within the program. After the operational phase, the program moves into the monitoring phase. By this point the Data Governance Office has been formed, and the policies, standards, and procedures have been defined.
    The Data Governance Office staff, within their roles and responsibilities, will then monitor the operation of the program and, where needed, make changes to the policies, standards, and procedures.
