Search Results

Search found 20714 results on 829 pages for 'cruise release management'.


  • Oracle VM Administration: Oracle VM Server for x86 - new Training

    - by Antoinette O'Sullivan
     Oracle VM Administration: Oracle VM Server for x86 - new course just released. This 3-day hands-on course teaches students how to build a virtualization platform using Oracle VM Manager and Oracle VM Server for x86. Students learn how to deploy and manage highly configurable, interconnected virtual machines. The course covers installing and configuring Oracle VM Server for x86, as well as the details of network and storage configuration, pool and repository creation, and virtual machine management. You can take this hands-on class either in a classroom or from your own desk.

    Read the article

  • Cannot set a credential for principal 'sa'

    - by hailey
    I was trying to change the SA password on my development server this morning and got an error: Msg 15535, Level 16, State 1, Line 1: Cannot set a credential for principal 'sa'. It was a little frustrating to get an error for a seemingly simple task, but then again, maybe I screwed something up. After a couple of searches I found a Microsoft KB article (support.microsoft.com/kb/956177), "You receive an exception in SQL Server 2008 when you try to modify the properties of the SQL Server Administrator account by using SQL Server Management Studio". It was written for SQL 2008, but it worked just fine on my SQL 2005 SP3 server. You have to check the Map to Credential check box, but you don't have to add any credentials; just click the OK button to complete, and that's it.
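
    If you would rather sidestep the dialog entirely, the password can also be changed in plain T-SQL, which avoids the SSMS bug altogether. A minimal sketch using sqlcmd; the server name and new password are placeholders:

        sqlcmd -S YourServer -E -Q "ALTER LOGIN sa WITH PASSWORD = 'N3wStrongPassword!';"

    The -E switch uses Windows authentication; use -U and -P instead if you connect with a SQL login.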

    Read the article

  • UNHCR and Stanyslas Matayo Receive Duke's Choice Award 2012

    - by Geertjan
    This year, NetBeans Platform applications winning Duke's Choice Awards were not only AgroSense, by Ordina in the Netherlands, and the air command and control system by NATO... but also Level One, the UNHCR registration and emergency management system. Unfortunately, Stanyslas Matayo, the architect and lead engineer of Level One, was unable to be at JavaOne to receive his award. It would have been really cool to meet him in person, of course, and he would have joined the NetBeans Party and NetBeans Day, as well as the NetBeans Platform panel discussions that happened at various stages throughout JavaOne. Instead, he received his award at Oracle Day 2012 Nairobi, some days ago, where he presented Level One and received the Duke's Choice Award: Level One is the UNHCR (UN refugee agency) application for capturing information on the first level details of refugees in an emergency context. In its recently released initial version, the application was used in Niger to register information about families in emergency contexts. Read more about it here and see the screenshot below. Congratulations, Stanyslas, and the rest of the development team working on this interesting and important project!

    Read the article

  • SOA PS5 Bundle Patch 4 and OSB PS5 Bundle Patch 1

    - by ShawnBailey
    Announcing the availability of SOA Suite 11g PS5 Bundled Patch 4 and OSB PS5 Bundled Patch 1. These bundled patches contain a number of high-impact fixes for PS5 and are recommended for anyone currently using this release. Please review the list of included fixes in the readmes, and if you are running with any SOA or OSB patches not included in the bundled patches, please ask Support to create a one-off on top of the appropriate bundled patch. The patches can be downloaded from My Oracle Support: go to 'Patches & Updates', enter '14406487' (SOA) or '14389126' (OSB), and click 'Search'. Further information on the included fixes can be found in the following documents on MOS: SOA 11g: Bundle Patch Reference, Doc ID 1485949.1; OSB 11g: Bundle Patch Reference, Doc ID 1499170.1.

    Read the article

  • "Magento, an integration that promises to be difficult": PrestaShop's CEO reacts to eBay's acquisition of the platform

    "Magento, an integration that promises to be difficult": PrestaShop's CEO reacts to eBay's acquisition of the platform. Christophe Cremer, CEO of PrestaShop, has just responded to eBay's recent acquisition of Magento in an opinion piece he sent to us. In his view, eBay has "made a good financial deal by taking an early interest in Magento, an e-commerce package that is unquestionably a reference in the field". The executive also tips his hat to the auction site's top management, no strangers to bold strategic moves. Its acquisition of PayPal in 2002 for 1.5 billion dol...

    Read the article

  • Use Drive Mirroring for Instant Backup in Windows 7

    - by Trevor Bekolay
    Even with the best backup solution, a hard drive crash means you'll lose a few hours of work. By enabling drive mirroring in Windows 7, you'll always have an up-to-date copy of your data.

    Windows 7's mirroring, which is only available in the Professional, Enterprise, and Ultimate editions, is a software implementation of RAID 1, meaning that two or more disks hold exactly the same data. The files are constantly kept in sync, so if one of the disks fails you won't lose any data. Note that mirroring is not technically a backup solution, because if you accidentally delete a file it's gone from both hard disks (though you may be able to recover the file). As an additional caveat, having mirrored disks requires changing them to "dynamic disks", which can only be read by modern versions of Windows (you may have problems working with a dynamic disk in other operating systems or in older versions of Windows). See this Wikipedia page for more information.

    You will need at least one empty disk to set up disk mirroring. We'll show you how to mirror an existing disk (of equal or lesser size) without losing any data on the mirrored drive, and how to set up two empty disks as mirrored copies from the get-go.

    Mirroring an Existing Drive

    Click the Start button and type partitions in the search box, then click the Create and format hard disk partitions entry that shows up. Alternatively, if you've disabled the search box, press Win+R to open the Run window and type in: diskmgmt.msc

    The Disk Management window will appear. We've got a small disk, labeled OldData, that we want to mirror in a second disk of the same size.

    Note: The disk that you will use to mirror the existing disk must be unallocated. If it is not, right-click on it and select Delete Volume... to mark it as unallocated. This will destroy any data on that drive.

    Right-click on the existing disk that you want to mirror and select Add Mirror.... Select the disk that you want to use to mirror the existing disk's data and press Add Mirror. You will be warned that this process will change the existing disk from basic to dynamic; note that this process will not delete any data on the disk!

    The new disk will be marked as a mirror and will start copying data from the existing drive to the new one. Eventually the drives will be synced up (it can take a while), and any data added to the E: drive will exist on both physical hard drives.

    Setting Up Two New Drives as Mirrored

    If you have two new equal-sized drives, you can format them to be mirrored copies of each other from the get-go. Open the Disk Management window as described above. Make sure that the drives are unallocated; if they're not, and you don't need the data on either of them, right-click and select Delete Volume....

    Right-click on one of the unallocated drives and select New Mirrored Volume.... A wizard will pop up; click Next. Click on the drives you want to hold the mirrored data and click Add (note that you can add any number of drives), then click Next. Assign a drive letter that makes sense and click Next. You're limited to using the NTFS file system for mirrored drives, so enter a volume label, enable compression if you want, and then click Next. Click Finish to start formatting the drives. You will be warned that the new drives will be converted to dynamic disks.

    And that's it! You now have two mirrored drives. Any files added to E: will reside on both physical disks, in case something happens to one of them.
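
    If you prefer the command line, the same mirror can be set up with diskpart. A quick sketch, assuming the volume to be mirrored is volume 2 and the empty second disk is disk 1; the numbers on your system will differ, so check them with list volume and list disk first:

        diskpart
        DISKPART> list disk
        DISKPART> select disk 1
        DISKPART> convert dynamic
        DISKPART> select volume 2
        DISKPART> add disk=1

    If diskpart complains that the source disk is still basic, convert it the same way first; once the mirror is added, it begins resynchronizing immediately.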
    Conclusion

    While the switch from basic to dynamic disks can be a problem for people who dual-boot into another operating system, setting up drive mirroring is an easy way to make sure that your data can be recovered in case of a hard drive crash. Of course, even with drive mirroring, we advocate regular backups to external drives or online backup services.

    Read the article

  • Azure Storage Explorer

    - by kaleidoscope
    Azure Storage Explorer: another way to deploy services to the cloud. Azure Storage Explorer is a useful GUI tool for inspecting and altering the data in your Azure cloud storage projects, including the logs of your cloud-hosted applications. All three types of cloud storage can be viewed: blobs, queues, and tables. You can also create or delete blob/queue/table containers and items. Text blobs can be edited, and all data types can be imported/exported between the cloud and local files. Table records can be imported/exported between the cloud and spreadsheet CSV files.

    Why Azure Storage Explorer? It is a licensed CodePlex project provided by Neudesic, a Microsoft partner. It is a simple UI that requires you to input your blob storage name, access key, and endpoints in the Storage Settings dialog. For more details please refer to the link: http://azurestorageexplorer.codeplex.com/Release/ProjectReleases.aspx?ReleaseId=35189

    Anish, S
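
    As a rough illustration of what that dialog asks for (the account name and key here are hypothetical; the endpoint URLs follow Azure's standard naming convention):

        Storage account name:  mystorageaccount
        Access key:            <primary access key from the Azure portal>
        Blob endpoint:         http://mystorageaccount.blob.core.windows.net
        Queue endpoint:        http://mystorageaccount.queue.core.windows.net
        Table endpoint:        http://mystorageaccount.table.core.windows.net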

    Read the article

  • Announcing the RTM of MvcExtensions (aka System.Web.Mvc.Extensibility)

    - by kazimanzurrashid
    I am proud to announce v1.0 of MvcExtensions (previously known as System.Web.Mvc.Extensibility). There have been quite a few changes and enhancements since the last release. Some of the major changes are: The namespace has been changed from System.Web.Mvc.Extensibility to MvcExtensions, to avoid the unnecessary confusion that it is in the .NET Framework or part of ASP.NET MVC. The project has moved from GitHub to CodePlex. The primary reason to start the project on GitHub was distributed version control, which is no longer a differentiator now that CodePlex has added Mercurial support. There is nothing wrong with GitHub; it is an excellent place for managing your project, but CodePlex has always been the native home for .NET projects. MVC 1.0 support has been dropped. I will be covering each feature in my blog, so stay tuned!!!

    Read the article

  • Cloud Computing Architecture Patterns: Don’t Focus on the Client

    - by BuckWoody
    Normally I try to put topics in the positive, in other words "do this" rather than "don't do that", but sometimes it's clearer to focus on what *not* to do. Popular development processes often start with screen mockups or user input descriptions. In a scale-out pattern like cloud computing on Windows Azure, that's the wrong place to start.

    Start with the Data

    Instead, I recommend that you start with the data that a process requires. That data might be temporary or persisted, but starting with the data and its requirements helps to define not only the storage engine you need but also drives everything from security to the integrity of the application. For instance, assume the requirements show that the user must enter their phone number, and that this datum is used in a contact management system further down the application chain. For that datum, you can determine what data type you need (U.S. only or international?), the security requirements, whether it needs ACID compliance, how it will be searched, indexed, and so on. From one small data point you can extrapolate your options for storing and processing the data. Here's the interesting part, which begins to break the patterns that we've used for decades: all of the data doesn't have the same requirements. The phone number might be best suited for a list, or an element, or a string, with either BASE or ACID requirements, based on how it is used. That means we don't have to dump everything into XML, an RDBMS, a NoSQL engine, or a flat file exclusively. In fact, one record might use all of those depending on the use-case requirements.

    Next Is Data Management

    With the data defined, we can move on to how to store it. Again, the requirements now dictate whether we need a full relational calculus or set-based operations, or whether we can choose another method based on the requirements for the data. And, breaking another pattern, it's OK to store data more than once, in more than one location. We do this all the time for reporting systems and Business Intelligence systems, so this is a pattern we need to think about even for OLTP data.

    Move to Data Transport

    How does the data get around? We can use a connection-based method, sending the data along a transport to the storage engine, but in some cases we may want to use a cache, a queue, the Service Bus, or Complex Event Processing.

    Finally, Data Processing

    Most RDBMS engines, NoSQL engines, and certainly Big Data engines not only store data but can process and manipulate it as well. It's doubtful that you'll calculate that phone number, right? Well, if you're the phone company, you most certainly will. And so we see that even once we've chosen the data type, storage, and engine, the same element can have different computing requirements based on how it is used.

    Sure, We Need a Front-End at Some Point

    Not all data is entered by human hands; in fact, most data isn't. We don't really need a Graphical User Interface (GUI); we need some way for a GUI to get data into and out of the systems listed earlier. But when we do need to allow users to enter or examine data, that should be left to the GUI that best fits the device the user has. Ever tried to use an application designed for a web browser on a phone? Or one designed for a tablet on a phone? It's usually quite painful. The siren song of "we'll just write one interface for all devices" is strong, and has beguiled many an unsuspecting architect. But they just don't work out. Instead, focus on the data, its transport, and its processing. Create API calls or a message system that allows for resilient transport to the device or interface, and let each interface do what it does best.

    References

    Microsoft Architecture Journal: http://msdn.microsoft.com/en-us/architecture/bb410935.aspx
    Patterns and Practices: http://msdn.microsoft.com/en-us/library/ff921345.aspx
    Windows Azure iOS, Android, Windows 8 Mobile Devices SDK: http://www.windowsazure.com/en-us/develop/mobile/tutorials/get-started-ios/
    Windows Azure Facebook SDK: http://ntotten.com/2013/03/14/using-windows-azure-mobile-services-with-the-facebook-sdk-for-windows-phone/

    Read the article

  • DBCC CHECKDB (BatmanDb, REPAIR_ALLOW_DATA_LOSS) &ndash; Are you Feeling Lucky?

    - by David Totzke
    I'm currently working for a client on a PowerBuilder to WPF migration. It's one of those "I could tell you, but I'd have to kill you" kinds of clients, and the quick-lime pits are currently occupied by the EMC tech... but I've said too much already.

    At approximately 3 or 4 PM that day, users of the Batman[1] application here in Gotham[1] started to experience problems accessing the application. Batman[2] is a document management system here that also integrates with the ERP system; very little goes on here that doesn't involve Batman in some way. The errors being received seemed to point to network issues (TCP protocol error, connection forcibly closed by the remote host, etc.) but the real issue was much more insidious.

    Connecting to the database via SSMS and performing selects on certain tables underlying the problem areas of the application started to reveal the issue. You couldn't do a SELECT * FROM MyTable without it bombing with the same error noted above. A run of DBCC CHECKDB revealed 14 tables with corruption. One of the tables with issues was the Document table; pretty central to a "document management" system.

    Information was obtained from IT that a single drive in the SAN went bad in the night. A new drive was in place and was working fine. The partition that held the Batman database is configured for RAID Level 5, so a single drive failure shouldn't have caused any trouble, and yet the database is corrupted. They do hourly incremental backups here, so the first thing done was to try a restore. A restore of the most recent backup failed, so they worked backwards until they hit a good point. This successful restore was from a backup at 3 AM, a full day behind, a time that also roughly corresponds with when the SAN started to report the drive failure. The plot thickens...

    I got my hands on the output from DBCC CHECKDB and noticed a pattern. What's sad is that nobody who should have noticed the pattern in the DBCC output did notice. There was a rush to do things to try and recover the data before anybody really understood what was wrong with it in the first place. Cooler heads must prevail in these circumstances: do some investigation and lay out a plan of action, or you could end up making things worse[3]. DBCC CHECKDB also told us that:

    repair_allow_data_loss is the minimum repair level for the errors found by DBCC CHECKDB

    Yikes. That means that the database is so messed up that you're definitely going to lose some stuff when you repair it to get it back to a consistent state. All the more reason to do a little more investigation into the problem. Rescuing this database is preferable to having to export all of the data possible into a new one: this is a fifteen-year-old application with about seven hundred tables, TRIGGERs everywhere, and referential integrity constraints to deal with, and only fourteen of the tables have an issue. We have a good backup that is missing the last 24 hours of business, which means we could have a "do-over" of yesterday, but that's not a very palatable option either.

    All of the affected tables had TEXT columns, and all of the errors were about LOB data types and orphaned off-row data, which basically means TEXT, IMAGE or NTEXT columns. If we did a SELECT on an affected table and excluded those columns, we got all of the rows. We exported that data into a separate database. Things are looking up.
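
    For the record, the repair described next boils down to a short sequence. A sketch rather than our exact session, since REPAIR_ALLOW_DATA_LOSS requires the database to be in single-user mode (and, as you're about to see, you want to run it against a copy):

        sqlcmd -S YourServer -E -Q "ALTER DATABASE BatmanDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE"
        sqlcmd -S YourServer -E -Q "DBCC CHECKDB ('BatmanDb', REPAIR_ALLOW_DATA_LOSS)"
        sqlcmd -S YourServer -E -Q "ALTER DATABASE BatmanDb SET MULTI_USER"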
    Working on a copy of the production database, we then ran DBCC CHECKDB with REPAIR_ALLOW_DATA_LOSS, and that "fixed" everything up. The allow-data-loss option deletes the bad rows. This isn't too horrible, as we have all of those rows, minus the text fields, from our earlier export. Now I could LEFT JOIN to the exported data to find the missing rows and INSERT them minus the TEXT column data. We had the restored data from the good 3 AM backup that we could now JOIN to and, with fingers crossed, recover the missing TEXT column information. We got lucky in that all of the affected rows were old, and in the end we didn't lose anything. :O All of the row counts along the way worked out, and it looks like we dodged a major bullet here.

    We've heard back from EMC, and it turns out the SAN firmware they were running here is apparently buggy. This thing is only a couple of months old. Grrr... They dispatched a technician that night to come and update it. That explains why RAID didn't save us. All in all, this could have been a lot worse. Given the root cause here, they basically won the lottery in not losing anything.

    Here are a few links to some helpful posts on the SQL Server Engine blog. I love the title of the first one:

    Which part of 'REPAIR_ALLOW_DATA_LOSS' isn't clear?
    CHECKDB (Part 8): Can repair fix everything? (in fact, read the whole series)
    Ta da! Emergency mode repair (we didn't have to resort to this one, thank goodness)

    Dave

    Just because I can...

    [1] Names have been changed to protect the guilty.
    [2] I'm Batman.
    [3] And if I'm the coolest head in the room, you've got even bigger problems...

    Read the article

  • OWB 11gR2: Migration and Upgrade Paths from Previous Versions

    - by antonio romero
    Over the next several months, we expect widespread adoption of OWB 11gR2, both for its new features and because it is the only release of Warehouse Builder certified for use with database 11gR2. Customers seeking to move existing environments to OWB 11gR2 should review the new whitepaper, OWB 11.2: Upgrade and Migration Paths. This whitepaper covers the following topics: The difference between upgrade and migration, and how to choose between them An outline of how to perform each process When and where intermediate upgrade steps are required Tips for upgrading an existing environment to 11gR2 without having to regenerate and redeploy code to your production environment. Moving up from 10gR2 and 11gR1 is generally straightforward. For customers still using OWB 9 or 10.1, it is generally possible to move an entire environment forward complete with design and runtime audit metadata, but the upgrade process can be complex and may require intermediate processing using OWB 10.2 or OWB 11.1. Moving a design by itself is much simpler, though it requires regeneration and redeployment. Relevant details are provided in the whitepaper, so if you are planning an upgrade at some point soon, definitely start there.

    Read the article

  • SQLAuthority News: Free eBook Download: Introducing Microsoft SQL Server 2008 R2

    Microsoft Press has published a FREE eBook on the much-awaited release of SQL Server 2008 R2. The book is written by Ross Mistry and Stacia Misner. Ross is a personal friend of mine and one of the most active book writers in the SQL Server domain. When I see his name on any book, I am sure that [...]

    Read the article

  • GDD-BR 2010 [2E] Building Business Apps using Google Web Toolkit and Spring Roo

    GDD-BR 2010 [2E] Building Business Apps using Google Web Toolkit and Spring Roo
    Speaker: Chris Ramsdale
    Track: Cloud Computing
    Time: 14:40 - 15:25
    Room: sala[2]
    Level: 201
    Who says you can't build rich web apps for your business? Follow along in this session to learn how you can use the latest integrated set of tools from Google and VMware to take your internal business apps into the cloud. We'll cover how to get started using GWT with Spring Roo and SpringSource Tool Suite (STS), as well as the new data presentation widgets and MVP framework that will be available in the 2.1 release of GWT. (Video: 45:56, from GoogleDevelopers)

    Read the article

  • Microsoft Codename Houston

    - by kaleidoscope
    In one of the final talks about SQL Azure on Day 3 of PDC09, David Robinson, Senior PM on the Azure team, announced a project codenamed 'Houston', which is basically a Silverlight equivalent of SQL Server Management Studio. The concept stems from SQL Azure living in the cloud: if the only way to interact with it is by installing SSMS locally, the story does not feel consistent. In the limited preview it only contains the basics, but it clearly lets you create tables, stored procedures, and views, edit them, and even add data to tables in a grid view reminiscent of Microsoft Access. The UI is based around the standard ribbon bar, with an object window on the left and a working pane on the right. As of now this tool is still pre-alpha, and it seems like a basic tool that will facilitate rapid database development in the cloud. When asked about general availability, no dates were given, but calendar 2010 was indicated as the target. More information can be found at: http://sqlfascination.com/2009/11/20/pdc-09-day-3-sql-azure-and-codename-houston-announcement/

    Tinu, O

    Read the article

  • My app 'Howzzat Book' (Windows 8 Metro app) is in the store now

    - by nmarun
    I'm just so excited that my application Howzzat Book passed all certifications and is now in the Windows Store. Here's the email from Microsoft that I received: "Your app is in the Windows Store! Congratulations! Howzzat Book, release 1 is now in the Windows Store. Use this link to your app's listing in the Windows Store to let others know about your app." Link for Howzzat Book. Now, since they've just added it to the store, it might take some time to be available for download. So if you don't find...(read more)

    Read the article

  • Oracle... and InfiniBand.

    - by jenny.gelhausen
    Beginning Sunday, 14th March 2010 the OpenFabrics Alliance has been hosting its annual conference in Sonoma, California. On Monday morning, Tim Shetler - VP of Product Management at Oracle - addressed a conference room full to the brim with the industry's InfiniBand luminaries. That same afternoon, Sumanta Chatterjee, Senior Director of Development at Oracle, was publicly lauded by moderator Bill Boas for being a long-standing, pivotal driver and crucial member of the community. A testament to InfiniBand's building momentum, it is no surprise to find it at the core of Oracle's flagship product, the Sun Oracle Database Machine.

    Read the article

  • How to install new Intel Ethernet driver

    - by Alex Farber
    Ubuntu 12.04 x64 doesn't recognize the newest Intel Ethernet adapter on my desktop (Intel Ethernet Connection i217-V). I downloaded the required driver from Intel and compiled it using make. Now I have:

        alex@alex64-six:~$ find / -name 'e1000e.ko' 2>/dev/null
        /home/alex/Documents/IntelEthernetDriver/e1000e-3.0.4/src/e1000e.ko
        /lib/modules/3.2.0-64-generic/kernel/drivers/net/ethernet/intel/e1000e/e1000e.ko

    The first file is the new driver compiled from the Intel sources. The second is presumably the existing driver from the Ubuntu distribution, which doesn't recognize the new Ethernet adapter. How can I apply the new driver instead of the existing one? Any other solution is welcome. For now, I cannot upgrade to the latest Ubuntu release, because I use some third-party products.
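
    One common way to swap in the newly built module, sketched on the assumption that the Intel source tree ships its usual Makefile (paths as in the question):

        cd /home/alex/Documents/IntelEthernetDriver/e1000e-3.0.4/src
        sudo make install            # installs under /lib/modules/$(uname -r)/ and runs depmod
        sudo rmmod e1000e            # unload the in-tree driver, if it is loaded
        sudo modprobe e1000e         # load the freshly installed module
        sudo update-initramfs -u     # make sure the new module is also used at boot

    Note that an out-of-tree module like this has to be rebuilt after every kernel upgrade.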

    Read the article

  • Unity Is The Swiss Army Knife of Game Console Mods

    - by Jason Fitzpatrick
    This expansive console modification blends over a dozen game systems into one unified console with a shared power source and controller. There are console mods and then there are builds like this. This impressive work in progress combines the hardware boards of multiple game systems into a single unified system that shares a single power source, video output, and controller. The attention to detail and outright gaming obsession and geekiness is definitely creeping to the top of the charts with this one. Hit up the link below to check out a detailed post about the build and see additional videos and photos. Bacteria’s Project Unity [via Hack A Day]

    Read the article

  • Customer Spotlight: Land O’Lakes

    - by kellsey.ruppel
    Land O’Lakes, Inc. is one of America’s premier member-owned cooperatives, offering local cooperatives and agricultural producers across the nation an extensive line of agricultural supplies, as well as state-of-the-art production and business services. WinField Solutions, a company within Land O’Lakes, is using Oracle WebCenter to improve online experiences for their customers, partners, and employees. The company’s more than 3,000 seed customers, and its more than 300 internal and external sales force members and business partners, use Oracle WebCenter to handle all aspects of account management and order entry through a consolidated, personalized, secure user interface. Learn more about Land O’Lakes and Oracle WebCenter by reading this interview with Barry Libenson, Land O’Lakes chief information officer, or by watching this video.

    Read the article

  • How often are comments used in XML documents?

    - by Jeffrey Sweeney
    I'm currently developing a web-based XML managing program for a client (though I may 'market' it to future clients). Currently, it reads an XML document, converts it into manageable JavaScript objects, and ultimately spits out indented, easy-to-read XML code. Edit: The program would be used by clients that don't feel like learning XML to add items or tags, but I (or another XML developer) may use the raw data for quick changes without using an editor. I feel like, fundamentally, it's ready for release, but I'm wondering if I should go the extra mile and add support for remembering (and perhaps making) comments before generating the resulting XML. Considering that these XML files will probably never be read without a program interpreting them, should I really bother adding support for comments? I'll probably be the only one looking at the raw files, and I usually don't use comments in XML anyway. So, are comments common/important in most XML documents?
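
    For concreteness, a hypothetical fragment of the kind of thing such a tool would have to round-trip:

        <!-- Prices last reviewed 2013-06-01; do not edit by hand -->
        <item sku="A-1001">
          <price currency="USD">19.99</price>
        </item>

    If regeneration silently drops that comment, the note is lost to the next developer who reads the raw file.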

    Read the article

  • Wireless iwconfig rate auto too low

    - by Jamie Kitson
    Hi, left to its own devices my wireless connects at too low a speed. I have a 20 meg internet connection and my wireless is slowing it down to something like 3 meg. When I reboot into Windows it's fine. When I run iwconfig eth1 rate 24M (or even 48M) the connection is much faster and runs fine, so why won't it automatically go higher? Is this the fault of the driver? I am running Broadcom's driver compiled from source. Would adding iwconfig eth1 rate 24M to rc.local be the right way to force it at boot? Output from iwconfig when rate=auto:

        eth1    IEEE 802.11  ESSID:"honeypot"
                Mode:Managed  Frequency:2.417 GHz  Access Point: xxx
                Bit Rate=1 Mb/s  Tx-Power:24 dBm
                Retry min limit:7  RTS thr:off  Fragment thr:off
                Encryption key:off
                Power Management:off
                Link Quality=5/5  Signal level=-47 dBm  Noise level=-91 dBm
                Rx invalid nwid:0  Rx invalid crypt:2  Rx invalid frag:0
                Tx excessive retries:0  Invalid misc:0  Missed beacon:0

    Thanks, Jamie
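
    On the last question: rc.local is indeed one common place for a one-off tweak like this. A sketch, assuming the interface really is eth1 on your system; the line must come before the final exit 0:

        # /etc/rc.local
        iwconfig eth1 rate 24M
        exit 0

    That forces the rate at every boot, though it works around rather than explains the auto-negotiation problem, which does sound driver-related.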

    Read the article

  • Can't extract .tar.xz archive on 13.10 because permission denied

    - by HOS
    I used to work with Ubuntu 13.04, where I had installed VLC 2.1.0 from a .tar.xz archive, but after the release of 13.10 I erased 13.04 and installed 13.10. I tried to install VLC 2.1.0 the normal PPA way:

        sudo add-apt-repository ppa:videolan/stable-daily
        sudo apt-get update
        sudo apt-get install vlc browser-plugin-vlc

    but it installed VLC 2.0.9 instead, so I removed that and tried to install it the way I had done before on 13.04:

        wget -c download.videolan.org/pub/videolan/vlc/2.1.0/vlc-2.1.0.tar.xz
        tar -xJvf vlc-2.1.0.tar.xz
        cd vlc-2.1.0
        sudo apt-get build-dep vlc
        ./configure
        make
        sudo make install

    but suddenly an error stopped me while extracting the .tar.xz file: "Error setting owner: Operation not permitted". The owner of the file is me, and I changed all the owner settings in the file properties to read and write, but it doesn't work. What can I do? And if that can't be fixed, please suggest a good way to install VLC 2.1.0. (Thanks, and sorry for my bad English.)
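
    The "Error setting owner" message typically comes from an archive manager trying to restore the archive's recorded owner, which only root may do. One possible workaround, sketched with GNU tar (run as a normal user, it keeps you as the owner of the extracted files):

        tar --no-same-owner -xJvf vlc-2.1.0.tar.xz

    Extracting from a terminal into your home directory, rather than through the GUI archive tool, usually avoids the error entirely.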

    Read the article

  • Evernote for Android Updates with New Features and Updated Widget

    - by Jason Fitzpatrick
    Android: Evernote for Android now features enhanced sharing, tighter Skitch integration, and a brand new homescreen widget. With this update you can now share entire notebooks directly from your Android phone, edit and annotate images with Skitch, and use the Evernote widget regardless of where you have Evernote installed (the previous version of Evernote's widget would only function if Evernote was installed on the main memory instead of the SD card). You can read more about the new release here or hit up the link below to grab a copy from the Android Market. Evernote [Android Market]

    Read the article

  • Oracle OpenWorld: Oracle WebCenter Customer Appreciation Reception

    - by kellsey.ruppel
    Oracle WebCenter Customer Appreciation Reception Oracle WebCenter partners Fishbowl Solutions, Fujitsu, Keste, Mythics, Redstone Content Solutions, Team Informatics & TekStream invite you to a private cocktail reception at one of San Francisco's finest hotels. Please join us and other Oracle WebCenter customers for heavy hors d'oeuvres and cocktails at this exclusive reception. Tuesday, October 2, 2012 6:30 p.m. – 9:30 p.m. The Palace Hotel Ralston Ballroom 2 New Montgomery Street San Francisco, CA 94105 Don't miss the opportunity to meet and talk with executives from Oracle WebCenter Product Management, Product Marketing and Oracle's premier WebCenter partners. We look forward to seeing you at this event! RSVP Now Please RSVP to http://www.surveymonkey.com/s/OOW12 by September 26, 2012. You will receive an email notification from [email protected] confirming your attendance for this event. Sponsored by:

    Read the article

  • Chess as a team building exercise for software developers

    - by maple_shaft
    The last place I worked wasn't a particularly great place, and there were more than a few nights when we were working late into the evening trying to meet our sprints. The team, while stressed out, got pretty close, and people started bringing in little mind teasers and puzzles, just something we would all play around with and try to solve while a build/deploy was running for the test environment, or while we were waiting for an integration test run to finish. Eventually it turned into people bringing in chess boards and setting them up at their desks. We would play by email, sending each other moves in chess notation, but at a very casual pace, with games sometimes lasting two or three days. Management tolerated this when we were putting in overtime, but as things came to be managed better and people weren't working much more than 40 hours a week, they started cracking down and told us that we weren't allowed to have chess boards at our desks, although they were okay with the puzzle games. What are the pros and cons, in your opinion, of allowing chess during software development lull time?

    Read the article
