Search Results

Search found 9952 results on 399 pages for 'more details'.


  • Why is my laptop so sluggish? Or Damn You Facebook and Twitter! Or All Hail Chrome!

    - by John Conwell
    In the past three weeks, I've noticed that my laptop (dual core 2.1GHz, 2Gb RAM) has become amazingly sluggish. I only use it for communications and data lookup workflows, so the slowness was tolerable. But today I finally got fed up with the suckiness and decided to get to the root of the problem (I do have strong performance roots, after all).

    It actually didn't take all that long to figure it out. About a year ago I converted to Google Chrome (away from Firefox). One of the great features Chrome has is a "Task Manager" (Shift + Esc) that gives you Windows Task Manager-like details for all the tabs open in the browser. Since every tab runs in its own process, it's easy from Task Manager (either Windows or Chrome) to identify and kill a single performance-offending tab. This is unlike IE, where you only get aggregate data about all open tabs. Anyway, I digress.

    Today my laptop sucked. Windows Task Manager told me that I had two memory-hogging Chrome tabs, but couldn't tell me which web pages those tabs were showing. Enter Chrome Task Manager, which tells you the page title along with CPU, memory and network utilization of each tab. Enter my amazement. Turns out Facebook was using just shy of half a Gb of RAM. Half a Gigabyte! That's 512 Megabytes! 524,288 Kilobytes! 536,870,912 Bytes! Or 4,294,967,296 Bits! In other words, that's a frackin' boat load of memory. Now consider that Facebook is running on pretty much 96.3% (statistics based on absolutely nothing) of every household desktop, laptop, netbook, and mobile device in America, and that is pretty horrific! And I wasn't playing any Facebook games like FarmWars or MafiaVille. I just had my normal, default home page up showing me who just had breakfast, or just got finished with their morning run.

    I'm sorry... let me say that again: HALF A GIG OF RAM! That is just unforgivable. I can just see my mom calling me up:
    Mom: "John... I think I need a new computer. Mine is really slow these days"
    John: "What do you have running?"
    Mom: "Oh, just Facebook"
    John: "Ok, close Facebook and tell me how fast your computer feels"
    Mom: "Well... I don't know how fast it is. All I do is use Facebook"
    John: "Ok Mom, I'll send you a new computer by Tuesday"
    Oh yeah... and the other offending web page? It was Twitter, using a quarter of a Gigabyte. God I love social networks!

    Read the article

  • Impressions from VMworld - Clearing up Misconceptions

    - by Monica Kumar
    Gorgeous sunny weather…none of the usual summer fog…the Oracle Virtualization team has been busy at VMworld in San Francisco this week. From the time exhibits opened on Sunday, our booth staff was fully engaged with visitors. It was great to meet with customers and prospects, and there were many…most with promises to meet again in October at Oracle OpenWorld 2012. Interests and questions ran the gamut - from implementation details to consolidating applications to how does Oracle VM enable rapid application deployment to Oracle support and licensing. All good stuff! Some inquiries are poignant and really help us get at the customer pain points. Some are just based on misconceptions. We’d like to address a couple of common misconceptions that we heard: 1) Rapid deployment of enterprise applications is great but I don’t do this all the time. So why bother? While production applications don’t get updated or upgraded as often, development and QA staging environments are much more dynamic. Also, in today’s Cloud based computing environments, end users expect an entire solution, along with the virtual machine, to be provisioned instantly, on-demand, as and when they need to scale. Whether it’s adding a new feature to meet customer demands or updating applications to meet business/service compliance, these environments undergo change frequently. The ability to rapidly stand up an entire application stack with all the components such as database tier, mid-tier, OS, and applications tightly integrated, can offer significant value. Hand patching, installation of the OS, application and configurations to ensure the entire stack works well together can take days and weeks. Oracle VM Templates provide a much faster path to standing up a development, QA or production stack in a matter of hours or minutes. I see lots of eyes light up as we get to this point of the conversation. 2) Oracle Software licensing on VMware vSphere In the world of multi-vendor IT stacks, understanding license boundaries and terms and conditions for each product in the stack can be challenging.  Oracle’s licensing, though, is straightforward.  Oracle software is licensed per physical processor in the server or cluster where the Oracle software is installed and/or running.  The use of third party virtualization technologies such as VMware is not allowed as a means to change the way Oracle software is licensed.  Exceptions are spelled out in the licensing document labeled “Hard Partitioning". Here are some fun pictures! Visitors to our booth told us they loved the Oracle SUV courtesy shuttles that are helping attendees get to/from hotels. Also spotted were several taxicabs sporting an Oracle banner! Stay tuned for more highlights across desktop and server virtualization as we wrap up our participation at VMworld.

    Read the article

  • Less than 50 Lines of Code to Create a Java Palette in NetBeans

    - by Geertjan
    Want to drag and drop Java code snippets into the palette, in the same way as can be done for HTML files? If so, create a new module and add a class with the content below and you're done. You'll be able to select a piece of Java code, drag it into the palette (Ctrl-Shift-8 to open it), where you'll be able to set a name, tooltip, and icons for the snippet, and then you'll be able to drag it out of the palette into any Java files you like. The palette content is persisted across restarts of the IDE.

    package org.netbeans.modules.javasourcefilepalette;

    import java.io.IOException;
    import javax.swing.Action;
    import org.netbeans.api.editor.mimelookup.MimeRegistration;
    import org.netbeans.spi.palette.DragAndDropHandler;
    import org.netbeans.spi.palette.PaletteActions;
    import org.netbeans.spi.palette.PaletteController;
    import org.netbeans.spi.palette.PaletteFactory;
    import org.openide.util.Exceptions;
    import org.openide.util.Lookup;
    import org.openide.util.datatransfer.ExTransferable;

    public class JavaSourceFileLayerPaletteFactory {

        private static PaletteController palette = null;

        @MimeRegistration(mimeType = "text/x-java", service = PaletteController.class)
        public static PaletteController createPalette() {
            try {
                if (null == palette) {
                    return PaletteFactory.createPalette(
                        //Folder:
                        "JavaPalette",
                        //Palette Actions:
                        new PaletteActions() {
                            @Override
                            public Action[] getImportActions() {return null;}
                            @Override
                            public Action[] getCustomPaletteActions() {return null;}
                            @Override
                            public Action[] getCustomCategoryActions(Lookup lkp) {return null;}
                            @Override
                            public Action[] getCustomItemActions(Lookup lkp) {return null;}
                            @Override
                            public Action getPreferredAction(Lookup lkp) {return null;}
                        },
                        //Palette Filter:
                        null,
                        //Drag and Drop Handler:
                        new DragAndDropHandler(true) {
                            @Override
                            public void customize(ExTransferable et, Lookup lkp) {}
                        });
                }
            } catch (IOException ex) {
                Exceptions.printStackTrace(ex);
            }
            return null;
        }

    }

    In my layer file, I have this content:

    <folder name="JavaPalette">
        <folder name="Snippets"/>
    </folder>

    That's all. Run the module. Open a Java source file and the palette will automatically open. Drag some code into the palette and a dialog will pop up asking for some details like display name and icons. Then the snippet will be in the palette and you'll be able to drag and drop it anywhere you like. Use the Palette Manager, which is automatically integrated, to add new categories and show/hide palette items. Related blog entry, for which the above is a big simplification: Drag/Drop Snippets into Palette.

    Read the article

  • Bridging 10GbE with 12.04 - bridging works but the bridging computer has no internet access

    - by Donal
    I have been trying to get 12.04 bridging working with two 10GbE cards. I have two 10GbE cards in a Linux box being used only for this bridge: one with two 10GbaseT ports and another with a single CX4 port. I have two client computers connected with 10GbaseT cards, and the CX4 card connects to a ProCurve switch. I can get the bridging happening mostly the way that I want: the clients receive DHCP information from the DHCP server (not the bridging machine) and can connect to and properly see the rest of the network. Speeds are OK, not amazing, but working on that is another matter.

    My problem is that the bridging machine has no internet access... meaning I can't update anything or apt-get anything. It can ping all other machines on the local network. I've tried the helpful hints from https://help.ubuntu.com/community/NetworkConnectionBridge ("Enabling Internet Use on the Bridging Computer") and get "RTNETLINK answers: File exists", while dhclient br0 does nothing for me :( If it is anything, I think it is a multiple-route problem, as both br0 and eth4 have IP addresses... even though I have only set it up so that br0 has one...

    Bridge setup details, /etc/network/interfaces:

    auto br0
    iface br0 inet static
        address 192.168.0.246
        netmask 255.255.255.0
        gateway 192.168.0.1
        broadcast 192.168.0.255
        dns-nameservers 192.168.0.1
        dns-search example.com
        dns-domain example.com
        #(eth2 & eth3 are the 10GbaseT)
        #(eth4 is the CX4 connection)
        pre-up ip link set eth2 down
        pre-up ip link set eth3 down
        pre-up ip link set eth4 down
        pre-up brctl addbr br0
        pre-up brctl addif br0 eth4 eth3 eth2
        pre-up ip addr flush dev eth3
        pre-up ip addr flush dev eth2
        pre-up ip addr flush dev eth4
        post-down ip link set eth4 down
        post-down ip link set eth2 down
        post-down ip link set eth3 down
        post-down ip link set br0 down
        post-down brctl delif br0 eth2 eth3 eth4
        post-down brctl delbr br0

    ifconfig -a

    br0   Link encap:Ethernet  HWaddr 00:15:17:22:20:34
          inet addr:192.168.0.102  Bcast:192.168.0.255  Mask:255.255.255.0
          inet6 addr: fe80::215:17ff:fe22:2034/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:4957 errors:0 dropped:0 overruns:0 frame:0
          TX packets:1077 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:0
          RX bytes:596320 (596.3 KB)  TX bytes:139952 (139.9 KB)

    eth4  Link encap:Ethernet  HWaddr 00:60:dd:47:7c:05
          inet addr:192.168.0.57  Bcast:192.168.0.255  Mask:255.255.255.0
          inet6 addr: fe80::260:ddff:fe47:7c05/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:9000  Metric:1
          RX packets:15391 errors:0 dropped:51 overruns:0 frame:0
          TX packets:1207 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:5916769 (5.9 MB)  TX bytes:154312 (154.3 KB)
          Interrupt:70

    route -n

    Kernel IP routing table
    Destination     Gateway         Genmask         Flags Metric Ref    Use Iface
    0.0.0.0         192.168.0.1     0.0.0.0         UG    0      0        0 eth4
    0.0.0.0         192.168.0.1     0.0.0.0         UG    100    0        0 br0
    169.254.0.0     0.0.0.0         255.255.0.0     U     1000   0        0 br0
    192.168.0.0     0.0.0.0         255.255.255.0   U     0      0        0 br0
    192.168.0.0     0.0.0.0         255.255.255.0   U     1      0        0 eth4
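    One thing worth trying (a rough sketch only, using the interface names from the question, and assuming the stray address on eth4 comes from a leftover dhclient run or manual configuration): clear the address and the duplicate default route from eth4 so that br0 is the only interface carrying an IP and a gateway, then re-test connectivity:

        # remove the IP and the extra default route from the bridged port
        sudo ip addr flush dev eth4
        sudo ip route del default dev eth4

        # confirm br0 still has its static address and a single default route
        ip addr show br0
        ip route show

        # quick connectivity test
        ping -c 3 192.168.0.1
        ping -c 3 8.8.8.8

    If that restores internet access, the underlying cause is most likely whatever is assigning an address to eth4 at boot; with the interfaces file above, only br0 should ever be configured.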

    Read the article

  • Phone number in meta description bad or good for local rankings NAP

    - by bybe
    Once again I'm at it with increasing people's local rankings, and I've learnt so much about local rankings in the past two weeks it feels like my brain is gonna pop. Anyway, the question is fairly simple for someone who engages in local rankings, and I appreciate the answer may involve a little guesswork, but isn't SEO mostly guessing anyway? From what I've read and learned, Google works off a system called NAP for local rankings (with many other factors, but this question is purely based on NAP). For people who care about local rankings, NAP stands for Name of Business / Address of Business / Phone Number for Business. Now, from what I've read, you don't need the whole NAP to be on one website; a P or just an N can help towards your local rankings. It's believed that a full NAP rewards more than just a P and an N, for example, but knowing Google they might have a diversity checker, which is my concern, as you'll see in a moment. Of course, sites are weighted differently depending on where your business is posted; it's certainly going to be more credible if your NAP details are in your national phone book than on, say, a blog site, so take this into consideration too.

    Pure guess (not a part of the question, but nonetheless a good read on my belief): my guesswork would make me believe that the formula looks something like (N)+(A)+(P)x(T), where (N)ame is 1 or 0 to indicate present or not, (A)ddress is 1 or 0 to indicate present or not, (P)hone is 1 or 0 to indicate present or not, and (T)rust is 1-100 to indicate level of trust. So a phone number appearing on YouTube might look something like 0+0+1x95 = 95, and a NAP appearing in your national phone book might look something like 1+1+1x100 = 300. Please note that I'm not saying this is the sole factor, and I'm sure it's way more complex than this, with other factors on the page and off the page (reviews, links, clicks) and so on, but it's still a contributor.

    The question: my question is fairly simple, and I imagine it is hard to impossible to answer definitively, but maybe someone has seen official wording elsewhere on this: is it bad to include the address or phone number in the meta description? The reason I ask is that one of my competitors has these elements in their meta descriptions and their local rankings are absolutely superb. The problem I have with this is scraper sites like 'Similar To', 'SEO Rankings' and thousands of the other scraper networks that scrape sites and then make URLs with your site information; they are mostly limited to your meta description, which means that your phone number, address and sometimes even your company name (if the domain is an exact match) will appear as AP, and even NAP, on thousands of websites. So, is it a bad strategy to include phone number and address in the meta description? Everything I have read suggests it's good, of course with the downside of maybe lowering the quality of the description for click-throughs, but top rankings would increase those tenfold anyhow.

    Read the article

  • Misaligned Display on Resume

    - by Shaun Killingbeck
    I have an odd issue with my laptop display when resuming from suspend. When I have an additional monitor connected there is no issue. However without an additional monitor connected, after resuming only the left 10% of the laptop screen (just enough to show the Unity Launcher and a bit more) is visibly working, although strangely in a screenshot this same 10% is shown on the right hand side of the screenshot: I ran xrandr --verbose before and after resume, and the only difference (using diff) was: 2c2 < LVDS connected 1366x768+0+0 (0x98) normal (normal left inverted right x axis y axis) 344mm x 194mm --- > LVDS connected 1366x768+1280+0 (0x98) normal (normal left inverted right x axis y axis) 344mm x 194mm This seems to suggest the screen position has been shifted by 1280 horizontally, the width of the second monitor I use. Indeed, running the command xrandr --output LVDS --pos 0x0 does bring the screen back to normal. However, I don't want to have to run this command every time, I'd prefer to cure the source of the problem than just correct the symptoms. Any ideas on how to get Ubuntu to keep the display configuration settings from before suspend when it resumes? or why it changes at all? Heres some technical details that might be pertinent: HP Pavilion DV6 Laptop Ubuntu 13.04 AMD Radeon HD 6400M Series AMD Radeon HD 6520G Using proprietary flgrx-updates driver and amdcccle (Catalyst Control Center) (Unfortunately the open source driver causes my laptop to run even hotter than it already does, otherwise I'd use that) The contents of Xorg.conf: Section "ServerLayout" Identifier "amdcccle Layout" Screen 0 "amdcccle-Screen[0]-0" 0 0 EndSection Section "Module" Load "glx" EndSection Section "Monitor" Identifier "0-LVDS" Option "VendorName" "ATI Proprietary Driver" Option "ModelName" "Generic Autodetecting Monitor" Option "DPMS" "true" Option "PreferredMode" "1280x768" Option "TargetRefresh" "60" Option "Position" "0 0" Option "Rotate" "normal" Option "Disable" "false" EndSection Section "Monitor" Identifier "0-CRT1" Option "VendorName" "ATI Proprietary Driver" Option "ModelName" "Generic Autodetecting Monitor" Option "DPMS" "true" Option "PreferredMode" "1280x768" Option "TargetRefresh" "60" Option "Position" "0 0" Option "Rotate" "normal" Option "Disable" "false" EndSection Section "Monitor" Identifier "1-LVDS" Option "VendorName" "ATI Proprietary Driver" Option "ModelName" "Generic Autodetecting Monitor" Option "DPMS" "true" Option "TargetRefresh" "60" Option "Position" "1280 0" Option "Rotate" "normal" Option "Disable" "false" Option "PreferredMode" "1366x768" EndSection Section "Monitor" Identifier "1-CRT1" Option "VendorName" "ATI Proprietary Driver" Option "ModelName" "Generic Autodetecting Monitor" Option "DPMS" "true" Option "TargetRefresh" "60" Option "Position" "0 0" Option "Rotate" "normal" Option "Disable" "false" Option "PreferredMode" "1280x1024" EndSection Section "Device" Identifier "amdcccle-Device[0]-0" Driver "fglrx" Option "Monitor-LVDS" "1-LVDS" Option "Monitor-CRT1" "1-CRT1" BusID "PCI:0:1:0" EndSection Section "Device" Identifier "amdcccle-Device[0]-1" Driver "fglrx" Option "Monitor-LVDS" "1-LVDS" BusID "PCI:0:1:0" Screen 1 EndSection Section "Screen" Identifier "Default Screen" DefaultDepth 24 EndSection Section "Screen" Identifier "amdcccle-Screen[0]-0" Device "amdcccle-Device[0]-0" DefaultDepth 24 SubSection "Display" Viewport 0 0 Virtual 2646 2646 Depth 24 EndSubSection EndSection Section "Screen" Identifier "amdcccle-Screen[0]-1" Device "amdcccle-Device[0]-1" 
DefaultDepth 24 SubSection "Display" Viewport 0 0 Depth 24 EndSubSection EndSection
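    Until the root cause is found, one workaround (just a sketch, untested, assuming a single user session on display :0, the placeholder username YOURUSER, and the LVDS output name reported by xrandr above) is to have pm-utils re-run the xrandr correction automatically after every resume by dropping a small hook into /etc/pm/sleep.d/:

        #!/bin/sh
        # /etc/pm/sleep.d/99_fix_lvds   (make executable: sudo chmod +x /etc/pm/sleep.d/99_fix_lvds)
        # Re-positions the laptop panel after resume from suspend.
        case "$1" in
            resume|thaw)
                # xrandr needs access to the running X session
                export DISPLAY=:0
                export XAUTHORITY=/home/YOURUSER/.Xauthority
                xrandr --output LVDS --pos 0x0
                ;;
        esac
        exit 0

    This only treats the symptom, of course; the offset itself appears to come from the amdcccle-generated Xorg.conf (the Option "Position" "1280 0" entry on the 1-LVDS monitor section), so removing or correcting that entry via the Catalyst Control Center may be the cleaner fix.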

    Read the article

  • Restoring MSDB

    - by David-Betteridge
    We recently performed a disaster recovery exercise which included the restoration of the MSDB database onto our DR server. I did a quick google to see if there were any special considerations and found the following MS article: Considerations for Restoring the model and msdb Databases (http://msdn.microsoft.com/en-us/library/ms190749(v=sql.105).aspx). It said both the original and replacement servers must be on the same version; I double-checked, and in my case they are both SQL Server 2008 R2 SP1 (10.50.2500). So I went ahead and stopped SQL Server Agent, restored the database and restarted the agent. I checked the jobs and they were all there; everything looked great, and was until the server was rebooted a few days later. Then the syspolicy_purge_history job started failing on the 3rd step with the error message "Unable to start execution of step 3 (reason: The PowerShell subsystem failed to load [see the SQLAGENT.OUT file for details]; The job has been suspended). The step failed."

    A bit more googling pointed me to the msdb.dbo.syssubsystems table:

    SELECT * FROM msdb.dbo.syssubsystems WHERE start_entry_point ='PowerShellStart'

    And in particular the value of subsystem_dll. It still had the path to SQLPOWERSHELLSS.DLL, but on the old server. The DR instance has a different name to the live instance and so the paths are different. This was quickly fixed with the following SQL:

    Use msdb;
    GO
    sp_configure 'allow updates', 1 ;
    RECONFIGURE WITH OVERRIDE ;
    GO
    UPDATE msdb.dbo.syssubsystems
    SET subsystem_dll='C:\Program Files\Microsoft SQL Server\MSSQL10_50.DR\MSSQL\binn\SQLPOWERSHELLSS.DLL'
    WHERE start_entry_point ='PowerShellStart';
    GO
    sp_configure 'allow updates', 0;
    RECONFIGURE WITH OVERRIDE ;
    GO

    I stopped and started SQL Server Agent and now the job completes. I then wondered if anything else might be broken:

    SELECT subsystem_dll FROM msdb.dbo.syssubsystems

    This shows a further 10 wrong paths; fortunately they are for parts of SQL (replication, SSIS etc.) we aren't using!

    Lessons learnt:
    1. DR exercises are a good thing!
    2. Keep the Live and DR environments as similar as possible.

    Read the article

  • Correct way to handle path-finding collision matrix

    - by Xander Lamkins
    Here is an example of me utilizing path finding. The red grid represents the grid utilized by my A* library to locate a distance. This picture is only an example; currently it is all calculated at the 1x1 pixel level (pretty darn laggy). I want to make it so that the farther I click, the less accurate it will be (split the map into larger grid pieces). Edit: as mentioned by Eric, this is not a required game mechanic. I am perfectly fine with any method that allows me to make this accurate while still fast. This isn't really the topic of this question though.

    The problem I have is that my current library uses a two-dimensional grid of integers. The higher the number in a cell, the more resistance for that grid tile. Currently I'm setting all unwalkable spots to Integer Max. Here is an example of what I want: I'm just not sure how I should set up the arrays of integers for the grid. Every time an element is added to or removed from the game, its collision details are updated in the table. Here is a picture of what the map looks like on my collision layer.

    I probably shouldn't be creating new arrays every time I have to do a path find, because my game needs to support tons of PF at the same time. Should I have multiple arrays that are all updated when the dynamic elements change (a building is built / a building is destroyed)? The problem I see with this is that it will probably make the creation and destruction of buildings a little more laggy than I would want, because it would be setting the collision grid for each of the accuracy levels. I would also have to add or remove some arrays if I ever changed the map size in the future. Should I generate a new array based on an accuracy value every time I need to PF? The problem I see with this is that it will probably make any form of PF just as laggy, because it will have to search through MapWidth x MapHeight cells to shrink it all down. Or is there a better way? I'm certainly not the best at optimizing anything, really. I've just started dealing with XNA, so I'm not used to optimization code having much of an effect until now... :( If you need code examples, please ask. I'll add them as an edit.

    EDIT: While this doesn't directly relate to the question, I figure the more information I provide, the better. So that units still end up as close as possible to the player's desired position, I've decided that once the unit PFs over to the less accurate grid piece, it will then PF on a more accurate level to the exact position requested.

    Read the article

  • Who should control navigation in an MVVM application?

    - by SonOfPirate
    Example #1: I have a view displayed in my MVVM application (let's use Silverlight for the purposes of the discussion) and I click on a button that should take me to a new page. Example #2: That same view has another button that, when clicked, should open up a details view in a child window (dialog). We know that there will be Command objects exposed by our ViewModel bound to the buttons with methods that respond to the user's click. But, what then? How do we complete the action? Even if we use a so-called NavigationService, what are we telling it? To be more specific, in a traditional View-first model (like URL-based navigation schemes such as on the web or the SL built-in navigation framework) the Command objects would have to know what View to display next. That seems to cross the line when it comes to the separation of concerns promoted by the pattern. On the other hand, if the button wasn't wired to a Command object and behaved like a hyperlink, the navigation rules could be defined in the markup. But do we want the Views to control application flow and isn't navigation just another type of business logic? (I can say yes in some cases and no in others.) To me, the utopian implementation of the MVVM pattern (and I've heard others profess this) would be to have the ViewModel wired in such a way that the application can run headless (i.e. no Views). This provides the most surface area for code-based testing and makes the Views a true skin on the application. And my ViewModel shouldn't care if it displayed in the main window, a floating panel or a child window, should it? According to this apprach, it is up to some other mechanism at runtime to 'bind' what View should be displayed for each ViewModel. But what if we want to share a View with multiple ViewModels or vice versa? So given the need to manage the View-ViewModel relationship so we know what to display when along with the need to navigate between views, including displaying child windows / dialogs, how do we truly accomplish this in the MVVM pattern?

    Read the article

  • Windows8, JavaScript and HTML5 - A good thing?

    - by Albers
    Most of us have seen the Windows 8 news regarding support for native HTML5/JavaScript applications. The press has pushed this as a potential threat to the .NET developer community because JavaScript and HTML5 were called "our new developer platform". The press release refers to "Web-connected and Web-powered apps built using HTML5 and JavaScript that have access to the full power of the PC.".Microsoft has also been hush on details related to these comments. Before we buy the hype and start worrying about a world where we drop our Visual Studio licenses and buy DreamWeaver - let's think about how Windows 8 HTML/JavaScript applications would be implemented. The HTML5 spec offers support for offline applications, but this won't offer the OS-integrated experience the press release refers to. MS has to be planning a way to extend access beyond the traditional JavaScript feature set. Microsoft has a similar option today: HTML Applications or HTAs. They come close to required features, but HTAs need ActiveX or Java integration to provide the promised OS-level access. I'm guessing that Microsoft's future OS strategy isn't built on developers cranking out ActiveX controls or Java applets. So where is Microsoft headed? One possibility is that MS builds a new JavaScript framework from the ground up outside their current APIs. Another idea would be for Microsoft to add support for JavaScript as a first class .NET language using the Dynamic Language Runtime. A solution based on the DLR could be integrated into an HTA-like model to provide the promised access, along with the full range of features in .NET Framework. Security comes included in the Framework. And the work necessary to support this integration would tie in nicely with the effort MS has recently made providing better JavaScript and HTML5 support in Visual Studio 2010. As a bonus, a full-fledged JavaScript DLR implementation would allow single language web solutions across client and server (think node.js) and would appeal to developers who are familiar with JavaScript but have less experience with the Microsoft tech stack. We will all get a better picture after the Build conference in September. But in the mean time we know that Microsoft has a reputation for providing strong developer support. We might want to reserve our harshest judgement and consider that the press release could hint at new opportunities for .NET development.

    Read the article

  • The Uganda .NET Usergroup meeting for January 2011 - a look back.

    - by Malisa L. Ncube
    We had a very interesting meeting on Friday 28th last week. We had 10 attendees and two speakers. The first topic presented was Cloud Computing, presented by Allan Rwakatungu @arwakatungu who works with MTN Uganda. He gave a very brilliant outline of how Cloud computing and service oriented applications had begun changing the platform for operating business and the costs it saves because of scalability and elasticity. He went on to demonstrate the steps you would take if you are beginning a new Windows Azure project. He explained the history and evolution of the Windows Azure, SQL Azure and cloud services offered by Amazon and google.com. The attendees had many questions to ask (obviously), but they were all answered very well. We once again thank Allan, for taking time to prepare the presentation and demonstrating for us. We recorded a video on the entire presentation and after doing some editing we will publish it. One wish which was echoed by most members was that Microsoft should open the cloud services and development for Africa. Microsoft currently does not even have servers here in Africa and so far, that does not put African developers in the same platform as other developers in other continents. Now is the time considering the improvements in network speeds and joining of the Seacom network and broadband.   I presented on Parallelism and Multithreading using .NET 4.0, I also gave some details on the language changes in C# 5.0 and the async keyword and the TaskEx class. I explained the Task, Scheduling of parallel tasks and demonstrated problems that may arise from using parallelism inappropriately. I also demonstrated the performance improvements that may be achieved by taking advantage of multi-core processors. You may download the presentation on Parallelism and Multi-threading from here. The resolution of the meeting was that we should meet more than once a month and begin other activities which should be more fun. e.g. Geek Dinner, Geek Beer or CodeCamp. Based on that we all agreed we shall have a mid-month meeting starting from February. Cheers folks! del.icio.us Tags: .net,usergroup,cloud computing,parallelism,multi-threading

    Read the article

  • Games at Work Part 1: Introduction to Gamification and Applications

    - by ultan o'broin
    Games Are Everywhere How many of you (will admit to) remember playing Pong? OK then, do you play Angry Birds on your phone during work hours? Thought about why we keep playing online, video, and mobile games and what this "gamification" business we're hearing about means for the enterprise applications user experience? In Reality Is Broken: Why Games Make Us Better and How They Can Change the World, Jane McGonigal says that playing computer and online games now provides more rewards for people than their real lives do. Games offer intrinsic rewards and happiness to the players as they pursue more satisfying work and the success, social connection, and meaning that goes with it. Yep, Gran Turismo, Dungeons & Dragons, Guitar Hero, Mario Kart, Wii Boxing, and the rest are all forms of work it seems. Games are, in fact, work taken so seriously that governments now move to limit the impact of virtual gaming currencies on the real financial system. Anyone who spends hours harvesting crops on FarmVille realizes it’s hard work too. Yet games evoke a positive emotion in players who voluntarily stay engaged with games for hours, day after day. Some 183 million active gamers in the United States play on average 13 hours per week. Weekly, 5 million of those gamers play for longer than a working week (45 hours). So why not harness the work put into games to solve real-world problems? Or, in the case of our applications users, real-world work problems? What’s a Game? Jane explains that all games have four defining traits: a goal, rules, a feedback system, and voluntary participation. We need to look at what motivational ideas behind the dynamics of the game—what we call gamification—are appropriate for our users. Typically, these motivators are achievement, altruism, competition, reward, self-expression, and status). Common game techniques for leveraging these motivations include: Badging and avatars Points and awards Leader boards Progress charts Virtual currencies or goods Gifting and giving Challenges and quests Some technology commentators argue for a game layer on top of everything, but this layer is already part of our daily lives in many instances. We see gamification working around us already: the badging and kudos offered on My Oracle Support or other Oracle community forums, becoming a Dragon Slayer implementor of Atlassian applications, being made duke of your favorite coffee shop on Yelp, sharing your workout details with Nike+, or donating to Japanese earthquake relief through FarmVille, for example. And what does all this mean for the applications that you use in your work? Read on in part two...

    Read the article

  • Improving the performance of JDeveloper11g (part 2) and JVMs in general

    - by asantaga
    Just received an email from one of our JVM developers who read my blog entry on Performance tuning JDeveloper11g and he's confirmed that all of the above parameters are totally supported :-) He's also provided a description of the parameters so we can learn what magic is actually being applied. - -XX:+AggressiveOpts -- this enables the latest and greatest JVM optimizations. It will likely help most Java applications. It's fully supported. The downside of it is that because it has the latest and greatest optimizations, there is some small probability that it may not offer as good of an experience. As those features enabled with this command line option have "matured", they are made the default in a future JDK release. So, you can think of this command line option as the place where the newest optimizations get introduced. Some time later they are moved out from under AggressiveOpts to become default behavior. -XX:+OptimizeStringConcat -- only works with the -server JVM. It may be enabled by the default in a future JDK 7 update release. This option delays the construction of a StringBuilder/StringBuffer and attempts to avoid re-sizing the underlying char[] by attempting to detect the size of the char[] to allocate based on what's being appended to the StringBuilder/StringBuffer. -XX:+UseStringCache -- I would not suggest using this unless you knew that JDeveloper allocated the same string over and over again. And, the string that's allocated over and over again is one of the first 100,000 allocated strings. In short, I'd recommend against using it. And, in fact, in Java 7 (currently) does not include this feature. -XX:+UseCompressedOops -- applicable to 64-bit JVMs. And, if you're using a 64-bit JVM, I'd suggest you use it. It's auto enabled in JDK 7 64-bit JVMs and later JDK 6 64-bit JVMs enable it by default too. -XX:+UseGCOverheadLimit -- by default this option is already enabled. One other command line option to consider is -XX:+TieredCompilation for a JDK 6 Update 25 or later, or JDK 7. This gives you the startup of a -client JVM and the peak performance of a -server JVM. Awesome-ness!  Finally, Charlies also pointed out to me a "new" book he's just published where he goes into the details of JVM tuning, a must for all Fusion Middleware tuning exercises..  (click the book)  Thanks Charlie!
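    For anyone who wants to experiment with these, a minimal sketch of what the relevant lines might look like in JDeveloper's jdev.conf (the exact file location varies by installation, so treat the path in the comment as an assumption; the flags themselves are the supported options described above, and the compressed-oops option only applies to a 64-bit JVM):

        # jdeveloper/jdev/bin/jdev.conf (path may differ per installation)
        AddVMOption -XX:+AggressiveOpts
        AddVMOption -XX:+OptimizeStringConcat
        AddVMOption -XX:+UseCompressedOops
        # JDK 6 Update 25 or later, or JDK 7, only:
        AddVMOption -XX:+TieredCompilation

    As always with JVM tuning, change one option at a time and measure, so you know which flag is actually helping.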

    Read the article

  • Partner Webcast – Oracle SOA Suite 12c: Connect 4 Cloud, Mobile, IoT with On-premise - August 28th 2014

    - by JuergenKress
    Thursday August 28th 2014 SOA Suite 12c Webcast The pace of new business projects continues to grow from increasing customer self-service to seamlessly connecting all your back office and in-the-field applications. At the same time increased integration complexity may seem inevitable as organizations are suddenly faced with the requirement to support three new integration challenges: » Cloud Integration - integrate with the cloud, rapidly integrate a growing list of cloud applications with existing applications » Mobile Integration - the urgency to mobile-enable existing applications » IoT Integration - begin development on the latest trend of connecting Internet of Things (IoT) devices to your existing infrastructure. Join this webcast to get an overview of what is in Java 8 from a business perspective and how with Java 8, you are uniquely positioned to extend innovation in your solutions through the largest, open, standards-based, community-driven platform. Oracle SOA Suite 12c Oracle SOA Suite 12c, the latest version of the industry’s most complete and unified application integration and SOA solution, aims to simplify, accelerate and optimize integrations. Oracle SOA Suite 12c and its associated products, Oracle Managed File Transfer, Oracle Cloud and Application Adapters, B2B and healthcare integration, offer the industry’s most highly integrated platform for solving the increased integration challenges. Oracle SOA Suite 12c is a complete, integrated and best-of-breed platform. It enables next generation integration capabilities through A unified toolset for the development of services and composite applications. A standards-based platform that is service enabled and easily consumable by modern web applications, allowing enterprises to quickly and easily adapt to changes in their business and IT environments. Greater visibility, controls and analytics to govern how services and processes are deployed, reused and changed across their entire lifecycle. Join us to find out more about the new features of Oracle SOA Suite 12c and how it enables you to reduce time to market for new project integration and to reduce integration cost and complexity. Oracle SOA Suite is the ability to simplify by integrating the disparate requirements of cloud, mobile, and IoT devices with existing on-premise applications. Agenda: Oracle SOA Suite 12c new Features Cloud Integration Mobile Enablement Interent of Things (IoT) Summary - Q&A For details please visit our registration page here. Thursday, Aug 28th 2014 10am CET  (9am GMT / 11am EEST SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member in the SOA & BPM Partner Community for registration please visit www.oracle.com/goto/emea/soa (OPN account required) If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Facebook Wiki Technorati Tags: SOA Suite 12c,Community,Oracle SOA,Oracle BPM,OPN,Jürgen Kress,SOA

    Read the article

  • NEW: Oracle Certification Exam Preparation Seminars

    - by Harold Green
    Hi Everyone, I am really excited about a new offering that we are announcing this week - Oracle Certification Exam Preparation Seminars. These are something that will make a big difference for many of you in your efforts to become certified and move your career forward. They are also something that have previously only been available (but very popular) to the limited number of customers who have attended our annual conferences in San Francisco (Oracle OpenWorld and JavaOne). These are the first in a series of offerings that we are releasing over the next few months. So for those of you either preparing or considering Oracle certification - keep watching here on the blog, Facebook, Twitter and the Oracle Certification website for additional announcements related to our most popular certification areas. Details of the new Exam Preparation Seminars are found below: NEW: ORACLE CERTIFICATION EXAM PREPARATION SEMINARS Becoming Oracle certified is a great way to build your career, gain additional credibility and improve your earning power. We know that the decision to become certified is not trivial. Our surveys indicate that people consider their time investment a critical factor in their decision to become certified. Your time is important. In order to help candidates maximize the efficiency of their study time we are releasing a new series of video-based seminars called Exam Preparation Seminars. These seminars are patterned after the extremely popular Exam Cram sessions that until now have only been available at our annual customer conferences (Oracle Open World and JavaOne). Beginning today they are now available to anyone, anywhere as a part of this Exam Prep Seminar series. Features: Fast-paced objective by objective review of the exam topics - led by top Oracle University instructors 24/7 access through Oracle University's training on demand platform. Ability to re-watch all or part of the the seminar. All the conveniences of video-based training: start, stop, fast-forward, skip, rewind, review. Tips that will help you better understand what you need to know to pass the exam. The Exam Preparation Seminars are meant to help anyone with a working knowledge of the technology get that extra boost to help them finalize their preparation, and will help anyone who wants a better understanding of the the depth and breadth of the exam topics and objectives. Benefits: Save time by understanding what you should study. Makes you efficient because you will understand the breadth and depth of each of the exam topics. Helps you create a better, more efficient study plan. Improves your confidence in your skills and ability to pass the certification exam. Exam Preparation Seminars are available individually, or in convenient Value Packages (which include the Exam Preparation Seminar, and an exam voucher which includes one free-retake if you need it). Currently we are releasing two seminars - one for DBA SQL and one for DBA Administration I. Additional offerings are in process. Find out more: General WEB: Oracle Certification Exam Preparation Seminars VIDEO: Exam Preparation Seminars Promo (1:27) Oracle Database Administration I (11g, 10g) VIDEO: Instructor Introduction (1:08) VIDEO: Sample Video (2:16) Oracle Database SQL VIDEO: Instructor Introduction (1:08) VIDEO: Sample Video (2:16)

    Read the article

  • can't update nvida having error near the end of the install

    - by user94843
    I have just got Ubuntu (first-timer to Ubuntu, so be very descriptive). I think there's a problem with my Nvidia update; it won't let me update it. This is the name of the update in Update Manager: "NVIDIA binary Xorg driver, kernel module and VDPAU library". When I attempt to install it, it starts out fine, but near the end I get a window titled "Package operation failed" with this under the details:

    installArchives() failed: Setting up nvidia-current (295.40-0ubuntu1) ...
    update-initramfs: deferring update (trigger activated)
    INFO:Enable nvidia-current
    DEBUG:Parsing /usr/share/nvidia-common/quirks/put_your_quirks_here
    DEBUG:Parsing /usr/share/nvidia-common/quirks/dell_latitude
    DEBUG:Parsing /usr/share/nvidia-common/quirks/lenovo_thinkpad
    DEBUG:Processing quirk Latitude E6530
    DEBUG:Failure to match Gigabyte Technology Co., Ltd. with Dell Inc.
    DEBUG:Quirk doesn't match
    DEBUG:Processing quirk ThinkPad T420s
    DEBUG:Failure to match Gigabyte Technology Co., Ltd. with LENOVO
    DEBUG:Quirk doesn't match
    Removing old nvidia-current-295.40 DKMS files...
    Loading new nvidia-current-295.40 DKMS files...
    Error! DKMS tree already contains: nvidia-current-295.40
    You cannot add the same module/version combo more than once.
    dpkg: error processing nvidia-current (--configure):
    subprocess installed post-installation script returned error exit status 3
    Processing triggers for bamfdaemon ...
    Rebuilding /usr/share/applications/bamf.index...
    Processing triggers for initramfs-tools ...
    update-initramfs: Generating /boot/initrd.img-3.2.0-31-generic
    Warning: No support for locale: en_US.utf8
    Errors were encountered while processing:
    nvidia-current

    Error in function: Setting up nvidia-current (295.40-0ubuntu1) ...
    update-initramfs: deferring update (trigger activated)
    INFO:Enable nvidia-current
    DEBUG:Parsing /usr/share/nvidia-common/quirks/put_your_quirks_here
    DEBUG:Parsing /usr/share/nvidia-common/quirks/dell_latitude
    DEBUG:Parsing /usr/share/nvidia-common/quirks/lenovo_thinkpad
    DEBUG:Processing quirk Latitude E6530
    DEBUG:Failure to match Gigabyte Technology Co., Ltd. with Dell Inc.
    DEBUG:Quirk doesn't match
    DEBUG:Processing quirk ThinkPad T420s
    DEBUG:Failure to match Gigabyte Technology Co., Ltd. with LENOVO
    DEBUG:Quirk doesn't match
    Removing old nvidia-current-295.40 DKMS files...
    Loading new nvidia-current-295.40 DKMS files...
    Error! DKMS tree already contains: nvidia-current-295.40
    You cannot add the same module/version combo more than once.
    dpkg: error processing nvidia-current (--configure):
    subprocess installed post-installation script returned error exit status 3
    Processing triggers for bamfdaemon ...
    Rebuilding /usr/share/applications/bamf.index...
    Processing triggers for initramfs-tools ...
    update-initramfs: Generating /boot/initrd.img-3.2.0-31-generic
    Warning: No support for locale: en_US.utf8
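    The key line in that output is "Error! DKMS tree already contains: nvidia-current-295.40", i.e. a leftover copy of the same driver version is still registered with DKMS, so the package's post-install step refuses to add it again. A commonly suggested way out (a sketch only; the module name and version are taken from the log above, so adjust them if yours differ) is to drop the stale DKMS entry and then let the package manager finish the interrupted configuration:

        # remove the stale DKMS registration for this driver version
        sudo dkms remove nvidia-current/295.40 --all

        # then let apt/dpkg finish what was interrupted
        sudo apt-get install -f
        sudo dpkg --configure -a

    If the update still fails afterwards, reinstalling the driver package (sudo apt-get install --reinstall nvidia-current) followed by a reboot is the usual next step.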

    Read the article

  • Java JRE 1.6.0_35 Certified with Oracle E-Business Suite

    - by Steven Chan (Oracle Development)
    The latest Java Runtime Environment 1.6.0_35 (a.k.a. JRE 6u35-b10) is now certified with Oracle E-Business Suite Release 11i and 12 desktop clients.   What's new in Java 1.6.0_35?See the 1.6.0_35 Update Release Notes for details about what has changed in this release.  This release is available for download from the usual Sun channels and through the 'Java Automatic Update' mechanism. 32-bit and 64-bit versions certified This certification includes both the 32-bit and 64-bit JRE versions. 32-bit JREs are certified on: Windows XP Service Pack 3 (SP3) Windows Vista Service Pack 1 (SP1) and Service Pack 2 (SP2) Windows 7 and Windows 7 Service Pack 1 (SP1) 64-bit JREs are certified only on 64-bit versions of Windows 7 and Windows 7 Service Pack 1 (SP1). Worried about the 'mismanaged session cookie' issue? No need to worry -- it's fixed.  To recap: JRE releases 1.6.0_18 through 1.6.0_22 had issues with mismanaging session cookies that affected some users in some circumstances. The fix for those issues was first included in JRE 1.6.0_23. These fixes will carry forward and continue to be fixed in all future JRE releases.  In other words, if you wish to avoid the mismanaged session cookie issue, you should apply any release after JRE 1.6.0_22.All JRE 1.6 releases are certified with EBS upon release Our standard policy is that all E-Business Suite customers can apply all JRE updates to end-user desktops from JRE 1.6.0_03 and later updates on the 1.6 codeline.  We test all new JRE 1.6 releases in parallel with the JRE development process, so all new JRE 1.6 releases are considered certified with the E-Business Suite on the same day that they're released by our Java team.  You do not need to wait for a certification announcement before applying new JRE 1.6 releases to your EBS users' desktops. Important For important guidance about the impact of the JRE Auto Update feature on JRE 1.6 desktops, see: URGENT BULLETIN: All E-Business Suite End-Users Must Manually Apply JRE 6 Updates References Recommended Browsers for Oracle Applications 11i (Metalink Note 285218.1) Upgrading Sun JRE (Native Plug-in) with Oracle Applications 11i for Windows Clients (Metalink Note 290807.1) Recommended Browsers for Oracle Applications 12 (MetaLink Note 389422.1) Upgrading JRE Plugin with Oracle Applications R12 (MetaLink Note 393931.1) Related Articles Mismanaged Session Cookie Issue Fixed for EBS in JRE 1.6.0_23 Roundup: Oracle JInitiator 1.3 Desupported for EBS Customers in July 2009

    Read the article

  • Ubuntu 12.04 share the internet over WiFi from wvdial?

    - by Sour Lemon
    I have just installed Ubuntu 12.04 on a separate partition on my hard drive so I can dual boot to either Windows 7 or Ubuntu. I am living in Japan, so I'm using a mobile broadband USB device called "Softbank C02LC". By default it seems that this device isn't recognised, so I did the following:

    Terminal:
    sudo su
    nano /usr/bin/usbModemScript

    Nano:
    #!/bin/bash
    echo 1c9e 9900 > /sys/bus/usb-serial/drivers/option1/new_id

    Terminal:
    chmod +x /usr/bin/usbModemScript
    nano /etc/udev/rules.d/option.rules

    Nano:
    ATTRS{idVendor}=="1c9e", ATTRS{idProduct}=="9900", RUN+="/usr/bin/usbModemScript"
    ATTRS{idVendor}=="1c9e", ATTRS{idProduct}=="9900", RUN+="/sbin/modprobe option"

    which made the device visible from the network manager etc. However, even though I set up my details correctly when I created a new connection (correct username, APN etc.), as soon as I try to connect it almost immediately disconnects. Because of this I then followed the instructions at this site: http://debugitos.main.jp/index.php?Ubuntu%2F%A5%E2%A5%D0%A5%A4%A5%EB%A5%A4%A5%F3%A5%BF%A1%BC%A5%CD%A5%C3%A5%C8 and ended up using the c02lc_connect script at the bottom of the page to connect to the internet. The file contains the following bash script:

    #!/bin/sh
    usbinterfece=/dev/ttyUSB2
    VID=1c9e
    PID=9900
    WRONG_PID=f000
    LSUSB=/usr/sbin/lsusb
    GREP=/bin/grep
    MODPROBE=/sbin/modprobe
    SWITCH=/usr/sbin/usb_modeswitch
    SWITCH_D=/etc/usb_modeswitch.d
    WVDIAL=/usr/bin/wvdial
    SLEEP=/bin/sleep
    SUDO=/usr/bin/sudo
    WHICH=/usr/bin/which

    switch_config="$SWITCH_D/$VID:$WRONG_PID"

    if ! [ -x $WVDIAL -a -x $SWITCH ]; then
        echo "Install wvdial and usb_modeswitch."
        exit 0
    fi

    check_usb() {
        local vid="$1"
        local pid="$2"
        ($LSUSB | $GREP "$vid:$pid")
    }

    if ! (check_usb "$VID" "$PID"); then
        echo "Cannot find modem device..."
        if (check_usb "$VID" "$WRONG_PID") && ( [ -f "$switch_config" ] ); then
            echo "The device is attached but its mode is wrong."
            echo "Try usb_modeswitch..."
            $SUDO $SWITCH -c "$switch_config"
            $SLEEP 1
            if (check_usb "$VID" "$PID"); then
                echo "Successfully switched the mode."
            else
                echo "Failed to switch the mode..."
                exit 1
            fi
        else
            exit 1
        fi
    fi

    if [ ! -c "$usbinterface" ]; then
        $SUDO $MODPROBE usbserial vendor=0x$VID product=0x$PID
        $SLEEP 2
    fi

    $SUDO $WVDIAL

    This works completely fine - no problems whatsoever. But we also have one more laptop here which I need to share the internet connection with. In Windows 7 I do this with the Connectify program, and in Ubuntu I have seen that you can do things like set up hotspots etc. But because I am using wvdial I am not sure how I would share the internet. I am only beginning to use Ubuntu, but unfortunately, until I can figure out how to share the internet over WiFi when connected via wvdial, I have to stick with Windows. If you have any ideas on how to do this it would be much appreciated!
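    Because wvdial brings the modem up as a PPP interface (typically ppp0), sharing it is really two separate steps: turn the laptop's wireless card into a hotspot (for example with Network Manager's "Shared to other computers" connection method, or hostapd plus dnsmasq), and then NAT traffic from the wireless interface out through ppp0. A rough sketch of the NAT half, assuming the WiFi interface is wlan0 and the dial-up interface is ppp0 (check both with ifconfig first):

        # enable packet forwarding
        sudo sysctl -w net.ipv4.ip_forward=1

        # masquerade traffic from the WiFi side out over the wvdial (ppp0) link
        sudo iptables -t nat -A POSTROUTING -o ppp0 -j MASQUERADE
        sudo iptables -A FORWARD -i wlan0 -o ppp0 -j ACCEPT
        sudo iptables -A FORWARD -i ppp0 -o wlan0 -m state --state RELATED,ESTABLISHED -j ACCEPT

    The other laptop then needs an address on the WiFi network and this machine's WiFi address as its gateway (Network Manager's connection-sharing mode hands this out automatically via DHCP). These rules are not persistent across reboots, so once they work they could be appended to the end of the c02lc_connect script or saved with a tool such as iptables-persistent.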

    Read the article

  • The Other Side of XBRL

    - by john.orourke(at)oracle.com
    With the United States SEC's mandate for XBRL filings entering its third year, and impacting over 7000 additional companies in 2011, there's a lot of buzz in the industry about how companies should address the new reporting requirements.  Should they outsource the XBRL tagging process to a third party publisher, handle the process in-house with a bolt-on XBRL tool, or should they integrate XBRL tagging with the financial close and reporting process?  Oracle is recommending the latter approach, in fact  here's a link to a recent webcast that I did with CFO.com on this topic: http://www.cfo.com/webcasts/index.cfm/l_eventarchive/14548560 But production of XBRL-based filings is only half of the story. The other half is consumption of XBRL by regulators, academics, financial analysts and investors.  As I mentioned in my December article on the XBRL US conference, the feedback from these groups is that they are not really leveraging XBRL for analysis of companies due to a lack of tools and historic XBRL-based data on public companies.   The good news here is that the historic data problem is getting better as large, accelerated filers enter their third year of XBRL filings.  And the situation is getting better on the reporting and analysis tools side of the equation as well - and Oracle is leading the way. In early January, Oracle released the Oracle XBRL Extension for Oracle Database 11g.  This is a "no cost option" on top of the latest Oracle Database 11.2.0.2.0 release. With this added functionality organizations will have the ability to create one or more back-end XBRL repositories based on Oracle Database, which provide XBRL storage and query-ability with a set of XBRL-specific services.  The XBRL Extension to Oracle XML DB integrates easily with Oracle Business Intelligence Suite Enterprise Edition (OBIEE) for analytics and with interactive development environments (IDEs) and design tools for creating and editing XBRL taxonomies. The Oracle XBRL Extension to Oracle Database 11g should be attractive to regulators, stock exchanges, universities and other organizations that need to collect, analyze and disseminate XBRL-based filings.  It should also be attractive to organizations that produce XBRL filings, and need a way to store and compare their own XBRL-based financial filings to those of their peers and competitors. If you would like more information, here's a link to a web page on the Oracle Technology Network with the details about Oracle XBRL Extension for Oracle Database 11g, including data sheet, white paper, presentation, demos and other information: http://www.oracle.com/technetwork/database/features/xmldb/index-087631.html

    Read the article

  • Oracle Customer Success Forum - Batesville - Oracle Sales Cloud - June 24th, 5pm CET

    - by Richard Lefebvre
    Batesville uses Oracle Sales Cloud to create a common platform and standardize processes for business transformation across field sales and telesales. Using real-time KPI dashboards, they are measuring their business success with consistency across their sales reps.We are pleased to invite you to a discussion with Batesville on industry trends, why sales automation is important, reasons for choosing Oracle Sales Cloud, and the vendor evaluation process. Please click on the register button to confirm your attendance by 5:00 p.m. Pacific Time on June 23, 2014.Speakers: Diane Kinker, Director CRM Program Chris Haven, Senior Director Product Management, Oracle (Moderator) Organization Profile:Batesville (www.Batesville.com), a wholly owned subsidiary of Hillenbrand, Inc. (NYSE:HI), is the leader in the North American death care industry. For more than 125 years, Batesville has been dedicated to helping families honor the lives of those they love®. Batesville’s innovation has changed the face of funeral service, from advancements in manufacturing and quality to patented features and memorialization offerings, technology and web-based solutions, and profit-enhancing merchandising systems and room displays. Our history of manufacturing excellence, product innovation, superior customer service and reliable delivery has helped Batesville become – and remain – a market leader. Event Description:In this informal reference call, you will have the opportunity to hear Batesville discuss industry trends, why sales automation is important, the decision making process for choosing Oracle Sales Cloud, and the vendor evaluation process. The call will open with a brief overview, followed by discussion, and an open question and answer session. Please allow one hour for the call.Why Oracle:Batesville looked to transform its sales automation processes. Oracle Sales Cloud met these needs and Batesville’s requirements for: Standardized end-to-end Sales Processes including Sales Performance Management (territory management, quota management and incentive compensation) Mobile capabilities with integration to Microsoft Outlook and Smartphones Creation of the WIG Dashboard (Wildly Important Goal) using reporting and analytics Click the Register Now button to confirm your attendance for this informative event. Registration will close at 5:00 p.m. Pacific Time on June 23, 2014.After you register your information will be forwarded through an Approval Process. Once your registration request has been validated against the invitation database, you will receive an email confirmation with your registration details as long as there is availability. Please be advised that Batesville will revise the registrants list and may dismiss registrations as they see fit. Register Now!

    Read the article

  • Java EE 8 update

    - by delabassee
    Planning for Java EE 8 is now well underway. As you know, a few weeks ago, we conducted a three part Java EE 8 Community Survey (you can find the final summary here). The data gathered have been very influential for the next steps. You can now expect over the coming weeks and months to see updates on the various specifications that compose the Java EE platform. Some Specification Leads are busy gathering additional feedback regarding what they should focus their efforts on (e.g. CDI 2 survey). Other Specification Leads have already publicly exposed what they think should be one of the focus for the evolution of the specification they lead.  For example, adding Server-Senet Events (SSE) support in JAX-RS is being discussed here and adding MVC support is being discussed here. Please remember that the fact we are now discussing any feature does not insure that it will be included in the proposal, nor in any particular update to Java EE. We can expect additional enhancements, changes and evolutions as we get closer to the finalisation of the different specifications... and there is still a long way to go with these specification proposals! Linda DeMichiel, Java EE Co-Specification Lead, has recently posted a draft proposal for the Java EE 8 Platform specification. Linda's goal is to recruit people and companies supporting this proposal before submitting it to the JCP.  This draft proposal is very interesting reading as it contains relevant information on the plans for Java EE 8 such as : The themes: Support for the latest web standards (eg. HTTP 2.0)  Continue to work on ease of development Improve the infrastructure for cloud support Alignment with Java SE 8 New JSRs to be added to the platform: J-Cache Java API for JSON Binding Java Configuration Plans for the Web Profile Plans on technologies to prune in Java EE 8, ... So if you haven't done it yet, I really encourage you to read the Java EE 8 draft proposal! Our goal for the Java EE 8 specification is for it to be finalized in the second half of 2016. It is important to note that we are in the early days of Java EE 8 and at this stage everything (themes, content, timing, etc.) is preliminary. Everything still needs to be discussed, challenged and agreed within the different Java Community Process (JCP) Experts Groups (EGs). Some EGs that still need to be formed! It could also means that the roadmap will have to be adjusted to follow the progress being made in the different EGs. This is also a good occasion to remind you that participation within those upcoming JCP Experts Groups is encouraged. Contributing in an EG is an effective lever to influence what Java EE 8 will become! Finally, as things get more concrete, we will share details on how to engage in the different Java EE 8 related Adopt-a-JSR initiatives, another way to contribute. You can also read other posts related to Java EE 8, here at The Aquarium blog. Just look for articles with the 'javaee8' tag.

    Read the article

  • I owe you an explanation

    - by Blueberry Coder
    Welcome to my blog! I am Frédéric Desbiens, a new member of the ADF Product Management team. I joined Oracle only a few weeks ago. My boss is Grant Ronald, and I have the privilege of working in the same team as Susan Duncan, Frank Nimphius, Lynn Munsinger and Chris Muir. I share with them a passion for all things Java and ADF. With this blog, I hope to help you be more successful with our products, whether you are a customer or a partner. You may have heard of me before. Maybe you have my book on your bookshelf, or maybe we met at a conference. I went to JavaOne, ODTUG Kaleidoscope and Oracle OpenWorld in the past, when I worked for a major consulting firm. I will spare you the details of my career; you can have a look at my LinkedIn profile if you are curious about my past. Usually, my posts will be of a technical nature and will focus on Oracle ADF and Oracle JDeveloper. SOA and portals have always been two topics of interest for me, however, and I will write about them as well. Over time, you will probably get acquainted with my « strategic » side too. I devour history books, and have always had a tendency to look at the big picture. I will probably not resist the temptation of mixing IT and history, but this will be occasional, I promise! At this point, I owe you an explanation about the title of the blog. I am French-Canadian, and wanted to evoke my roots in an obvious yet unobtrusive way. I was born in Chicoutimi, one of the main cities of the Saguenay-Lac-Saint-Jean region. Traditionally, a large part of the wild blueberry production of the province of Québec comes from there. A common nickname for the inhabitants is thus Les Bleuets, « The Blueberries » in English. I hope to see you around. You can also follow me on Twitter at @BlueberryCoder.

    Read the article

  • Need help identifying what resources (e.g. in MIT OpenCourseWare) can help me prepare for a test [closed]

    - by jiewmeng
    I am entering university soon. I can sit for a placement test to see if I am eligible for exemptions. The details are at http://www.comp.nus.edu.sg/undergraduates/TestScope11_12.html
    CS2100 Computer Organisation (please click title)
    The objective of this module is to familiarise students with the fundamentals of computing devices. Through this module students will understand the basics of data representation, and how the various parts of a computer work, separately and with each other. This allows students to understand the issues in computing devices, and how these issues affect the implementation of solutions. Topics covered include data representation systems, combinational and sequential circuit design techniques, assembly language, processor execution cycles, pipelining, memory hierarchy and input/output systems.
    Recommended textbooks:
    - Digital Design: Principles and Practices [DDPP] by John F. Wakerly, Prentice-Hall. ISBN 0-13-324500-4.
    - Computer Organization and Design (The Hardware/Software Interface) by David A. Patterson and John L. Hennessy.
    CS2105 Introduction to Computer Networks (please click title)
    This course aims to provide a broad introduction to computer networks and some appreciation of network application programming. It covers a range of topics including basic data communication and computer network concepts, protocols, networked computing concepts and principles, network applications development and network security. The emphasis of teaching is on the working principles and application of computer networks. As an integral part of the course, tutorials and practical assignments reinforcing the learning will also be given. These assignments provide an early exposure to network application programming and should be possible to complete using personal computers and the school's network facilities.
    Topics included:
    - An overview of computer networks and the Internet
    - Basic data communications
    - Application layer
    - Transport layer
    - Network layer and routing
    - Link layer and local area networks
    Recommended textbook:
    - James F. Kurose & Keith W. Ross, Computer Networking: A Top-Down Approach Featuring the Internet, Addison Wesley, 2001
    I am wondering what resources (e.g. MIT OpenCourseWare or other universities' materials) are available to help me prepare for these particular modules. Does the networking module look similar to CCNA? And the computer organisation one: is it like electronics, assembly, etc.? I learnt some electronics at polytechnic, but looking at the sample papers, university looks very different... I have about one month to prepare if I want any chance of being exempted from these modules :) Any help?

    Read the article

  • Exalogic 2.0.1 Tea Break Snippets - Creating and using Distribution Groups

    - by The Old Toxophilist
    By default, running your Exalogic in a virtual configuration presents Cloud Users with what appears to be a single large resource: they can simply create vServers without caring about how those vServers are laid out on the underlying infrastructure. All the Cloud Users know is that they can create vServers. For example, if we have a Quarter Rack (8 nodes) and our Cloud User creates 8 vServers, those 8 vServers may run on 8 distinct nodes or may all run on the same node. Although in many cases we, as Cloud Users, may not be too worried about where the virtualisation algorithm decides to place our vServers, there are cases where it is extremely important that vServers run on distinct physical compute nodes. For example, if we have a WebLogic cluster, we will want the servers within the cluster to run on distinct physical nodes to cover the situation where one physical node is lost. To achieve this, the Exalogic virtualised implementation provides Distribution Groups, which define an anti-affinity policy that the underlying virtualisation algorithm takes into account when placing vServers. Note that Distribution Groups must be created before you create vServers, because a vServer can only be added to a Distribution Group at creation time.
    Creating a Distribution Group
    To create a Distribution Group we first need to select the Account in which the Distribution Group should be created. Once we have selected the account, the interface updates and account-specific actions are displayed within the Action panes. From the Action pane (or by right-clicking on the Account), select the "Create Distribution Group" action. This initiates the creation wizard, as follows.
    Distribution Group Details
    In the first step of the wizard we specify the name of the distribution group, which should be unique, and can provide a detailed description of the group.
    Distribution Group Configuration
    The second step of the wizard allows us to specify the number of elements required within the group, up to a maximum of the number of nodes in the Exalogic rack. At this point it is always better to specify a group with spare capacity, allowing for future expansion. As vServers are added to the group, the available slots decrease.
    Summary
    Finally, the last step of the wizard displays a summary of the information entered.
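
    The wizard drives the actual configuration, but the underlying idea is easy to picture in code. The toy Java sketch below is only a conceptual illustration of an anti-affinity constraint, not the Exalogic placement algorithm or any Exalogic API: members of the same distribution group are never placed on the same compute node, and placement fails if the group is larger than the number of nodes, which is why the group size is capped at the node count.

        import java.util.Arrays;
        import java.util.Iterator;
        import java.util.LinkedHashMap;
        import java.util.List;
        import java.util.Map;

        // Toy illustration of an anti-affinity (distribution group) constraint.
        public class AntiAffinityPlacement {

            // Places every vServer in a distribution group on its own node.
            static Map<String, String> place(List<String> groupMembers, List<String> nodes) {
                if (groupMembers.size() > nodes.size()) {
                    throw new IllegalStateException(
                        "Distribution group has more members than available compute nodes");
                }
                Map<String, String> placement = new LinkedHashMap<>();
                Iterator<String> freeNodes = nodes.iterator();
                for (String vServer : groupMembers) {
                    placement.put(vServer, freeNodes.next()); // never reuse a node for this group
                }
                return placement;
            }

            public static void main(String[] args) {
                List<String> cluster = Arrays.asList("wls-1", "wls-2", "wls-3");
                List<String> computeNodes = Arrays.asList(
                        "cn01", "cn02", "cn03", "cn04", "cn05", "cn06", "cn07", "cn08");
                // Prints {wls-1=cn01, wls-2=cn02, wls-3=cn03}: each cluster member on a distinct node.
                System.out.println(place(cluster, computeNodes));
            }
        }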

    Read the article

  • An invitation to join a JDeveloper and ADF productivity clinic (and more!) at KScope

    - by Chris Muir
    Would you like a chance to influence Oracle's decisions on tool usability and productivity? If you're attending ODTUG's Kaleidoscope conference this year in San Antonio, Oracle would like to invite you to participate in our usability research activities and, separately, our JDeveloper and ADF Productivity Clinics with our experienced user experience teams. The teams are keen to hear what you have to say about your experiences with our tools in general, and with JDeveloper and ADF specifically. The details of each event are described below.
    Invitation to Usability Activity - Sunday June 24th to Wednesday June 27th
    Oracle is constantly working on new tools and new features for developers, and invites YOU to become a key part of the process! As a special addition to Kscope 12, Oracle will be conducting onsite usability research in the Alyssum room, from Sunday June 24 to Wednesday June 27. Usability activities are scheduled ahead of time for participants' convenience. If you would like to take part, please fill out this form to let us know which session(s) you would like to attend and your development experience. You will be emailed your scheduled session before the start of the conference.
    JDeveloper and ADF Productivity Clinic - Thursday June 28th
    Are you concerned that Java, Oracle ADF or JDeveloper is difficult? Is JDeveloper making you jump through hoops? Do you hate a particular dialog or feature of JDeveloper? Well, come and get things off your chest! Oracle is hosting a product management and user experience clinic where we want to hear about your issues and concerns. What's difficult to use? What doesn't work the way you want, and how would you want it to work? What isn't behaving like your current favorite tool? If we can't help you on the spot, we'll take your feedback and use it to improve the product experience. It's a great opportunity to get answers, or to drive improvements. Drop by the Alyssum room any time from 8:30 to 10:30 on Thursday, June 28.
    We look forward to seeing you at KScope soon!

    Read the article

< Previous Page | 255 256 257 258 259 260 261 262 263 264 265 266  | Next Page >