Search Results

Search found 4667 results on 187 pages for 'john conrad'.


  • Silverlight Cream for April 27, 2010 -- #849

    - by Dave Campbell
    In this Issue: Mike Snow, Kunal Chowdhury, Giorgetti Alessandro, Alexander Strauss, Corey Schuman, Kirupa, John Papa, Miro Miroslavov, Michael Washington, and Jeremy Likness.
    Shoutouts:
    - Erik Mork and crew have posted their latest This Week In Silverlight April 23 2010
    - The Silverlight Team announced Microsoft releases Silverlight-powered Windows Intune beta
    - Jesse Liberty has posted his UK and Ireland Slides and Links
    - The Expression Blend and Design Blog reports a Minor Update to The Expression Blend 4 Release Candidate
    From SilverlightCream.com:
    - Silverlight Tip of the Day #6 – Toast Notifications: Mike Snow has Tip #6 up today and it's about Toast notifications in OOB apps: restrictions, creation, showing, and the code.
    - Silverlight Tutorials Chapter 2: Introduction to Silverlight Application Development: Part 2 of Kunal Chowdhury's introductory tutorial set is up ... he's covering how to create a Silverlight project, what's contained in it, and creating a User Control.
    - Silverlight, M-V-VM ... and IoC - part 3: Giorgetti Alessandro has part 3 of his Silverlight, IoC, and MVVM series up ... this one with an example using the code discussed previously. The project is on CodePlex, and he's not done with the series.
    - Application Partitioning with MEF, Silverlight and Windows Azure – Part I: Alexander Strauss is discussing Silverlight and MEF for loosely-coupled and partitioned apps. He's also using Azure in this discussion.
    - geekSpeak Recording - Five Key Developer Features in Expression Blend with Corey Schuman: Check out the latest geekSpeak on Channel 9, where Corey Schuman talks about the 5 key developer features in Expression Blend that will improve your productivity.
    - Using the ChangePropertyAction: Kirupa is discussing and demonstrating ChangePropertyAction. Check out the demo near the top of the post, then read how to do it, and download the source.
    - 3 Free Silverlight Demos: John Papa blogged about the 2 demos (with source) that have been updated to SL4, and a new one, all by Microsoft luminaries Karen Corby, Adam Kinney, Mark Rideout, Jesse Bishop, and John Papa: "ScrapBook", "HTML and Video Puzzle", and "Rich Notepad".
    - Floating Visual Elements: I like Miro Miroslavov's comment: "every Silverlight application “must” have some objects floating around in a quite 3D manner" :) ... well, they do that on the CompletIT site, and this is part 2 of their explanation of how all that goodness works.
    - MVVM – A Total Design Change Of Your Application With No Code: With some Blend goodness, Michael Washington completely reorganizes the UI of an MVVM application without touching any code ... project included.
    - MVVM with Transaction and View Locator Example: Jeremy Likness responded to reader requests and has an example up, with explanation, of marrying his last two posts: transactions with MVVM and View Model Locator.
    Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream | Join me @ SilverlightCream | Phoenix Silverlight User Group
    Technorati Tags: Silverlight | Silverlight 3 | Silverlight 4 | Windows Phone | MIX10

    Read the article

  • WebCenter .NET Accelerator - Microsoft SharePoint Data via WSRP

    - by john.brunswick
    Platforms in the enterprise will never be homogeneous. As much as any vendor would enjoy having their single development or application technology be exclusively adopted by customers, too much legacy, time, education, innovation and vertical business need exists to make using a single platform practical. Java and .NET are the two industry application platform heavyweights, and more often than not business users are leveraging various systems in their day-to-day activities that incorporate applications developed on top of both platforms. BEA Systems acquired Plumtree Software to complete their "liquid" view of data, stressing that, regardless of a particular source system, heterogeneous data could interoperate not only through layers that allowed for data aggregation, but also at the "glass", or UI, layer. The technical components that allowed the integration at the glass thrive today at Oracle, helping WebCenter to provide a rich composite application framework. Oracle Ensemble and the Oracle .NET Application Accelerator allow WebCenter to consume and interact with the UI layers provided by .NET applications and a series of other technologies. The beauty of the .NET accelerator is that it can consume any .NET application and act as a Web Services for Remote Portlets (WSRP) producer. I recently had a chance to leverage the .NET accelerator to expose an ASP.NET 2.0 (C#) application in the WebCenter UI (pictured above) and wanted to share a few tips to help others get started with similar integrations. I was using two virtual machines for the exercise - one with Windows Server 2003 running SharePoint, and the other running WebCenter Spaces 11g. For my sample application data I ended up using SharePoint 2007 lists and calendars (MOSS 2007) to supply results using a .NET API for SharePoint.
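
    Not part of the original post, but for readers who want a feel for the kind of list access involved, here is a minimal C# sketch that reads items from a SharePoint 2007 (MOSS) list through the server-side object model. The site URL and list name are hypothetical placeholders, and code like this only runs on the SharePoint server itself; the accelerator integration described above wraps equivalent logic inside an ASP.NET application that is then exposed over WSRP.

        using System;
        using Microsoft.SharePoint; // MOSS 2007 server object model

        class SharePointListReader
        {
            static void Main()
            {
                // Hypothetical site URL and list name -- substitute real values.
                using (SPSite site = new SPSite("http://moss2007/sites/team"))
                using (SPWeb web = site.OpenWeb())
                {
                    SPList calendar = web.Lists["Team Calendar"];

                    // Walk the list items and print a couple of fields.
                    foreach (SPListItem item in calendar.Items)
                    {
                        Console.WriteLine("{0}: {1}", item.ID, item.Title);
                    }
                }
            }
        }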

    Read the article

  • Exciting New Job Announcement!

    - by John Blumenauer
    I’m extremely excited to announce that I’ve accepted a position with Applied Information Sciences (AIS).  The innovative work and company values at AIS are aligned with what I was seeking in my next position.  Also, over the past year or so, I’ve met some really talented individuals who work at AIS, so when the opportunity presented itself, I decided it was time to make a change. I look forward to the challenges ahead and working with a team of highly talented and motivated individuals.

    Read the article

  • O the Agony - Merging Scrum and Waterfall

    - by John K. Hines
    If there's nothing else to know about Scrum (and Agile in general), it's this: you can't force a team to adopt Agile methods. In all cases, the team must want to change. Well, sure, you could force a team. But it's going to be a horrible, painful process with a huge learning curve, made even steeper by the lack of training and motivation on the part of the team. On a completely unrelated note, I've spent the past three months working on a team that was formed by merging three separate teams. One of these teams had been adopting and using Agile practices like Scrum since 2007; another was in continuous bug-fix mode, releasing on average one new piece of software per year using semi-Waterfall methods. In particular, one senior developer on the Waterfall team didn't see anything in Agile but overhead. Fast forward through three months of tension, passive resistance, process pushback, and you have seven people who want to change and one who explicitly doesn't. It took two things to make Scrum happen:
    - The team manager took a class called "Agile Software Development using Scrum".
    - The team lead explained, with another senior developer and the manager present, that the point of Agile was to reduce the workload of the senior developer.
    It's incredible to me how a single person can strongly influence the direction of an entire team. Let alone if Scrum comes down as some managerial decree onto a functioning team who have no idea what it is. Pity the fool. On the bright side, I am now an expert at drawing Visio process flows. And I have some gentle advice for any first-level managers:
    - If you preside over a team process change, it's beneficial to start the discussion on how the team will work as early as possible. You should have a vision for this and guide the discussion, even if decisions are weeks away.
    - Don't always root for the underdog. It's been my experience that managers who see themselves as compassionate and caring spend a great deal of time understanding and advocating for the one person on the team who feels left out. Remember that by focusing on this one person you risk alienating the rest of the team, allowing tension to build and delaying the resolution of the problem.
    My way would have been to decree Scrum, force all of my processes on everyone else, and use the past three months ironing out the kinks. Which takes us all the way back to point number one.
    Technorati tags: Scrum, Scrum Process, Scrum and Waterfall

    Read the article

  • Troubleshooting Microsoft Message Queuing Issues on Microsoft Lync Server 2010

    - by John Breakwell
    This blog post sounds specific but most of the troubleshooting tips can be applied to other scenarios: Troubleshooting Microsoft Message Queuing Issues on Microsoft Lync Server 2010 Microsoft Message Queuing (MSMQ) plays an important role in the Microsoft Lync Server 2010 Monitoring/Archiving server infrastructure: in a distributed network environment, MSMQ is used to transmit data from agents located on other servers (such as Front End Servers) to Monitoring/Archiving servers. The purpose of this article is to help you discover the root cause of any MSMQ problems that you might encounter, and to provide suggested ways to fix those problems. Microsoft Lync Server is the new name for Microsoft Office Communications Server. It’s good to see a major product make use of MSMQ – there aren’t many in the public eye (Symantec’s Enterprise Vault comes to mind).
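
    This is not from the linked article, but as a quick starting point for the same kind of troubleshooting, the C# sketch below uses the System.Messaging API to confirm that a local private MSMQ queue exists and to report how many messages are sitting in it. The queue path is a hypothetical placeholder; the queue names actually used by the Lync Server monitoring and archiving agents will differ. A steadily growing message count with no consumer is often the first visible symptom that the downstream service is not reading from the queue.

        using System;
        using System.Messaging; // add a reference to System.Messaging.dll

        class MsmqQueueCheck
        {
            static void Main()
            {
                // Hypothetical private queue path -- substitute the real queue name.
                const string path = @".\private$\MonitoringQueue";

                if (!MessageQueue.Exists(path))
                {
                    Console.WriteLine("Queue {0} does not exist on this machine.", path);
                    return;
                }

                using (var queue = new MessageQueue(path))
                {
                    // Counting by enumeration is fine for a spot check,
                    // though it can be slow on very deep queues.
                    int count = queue.GetAllMessages().Length;
                    Console.WriteLine("{0} currently holds {1} message(s).", path, count);
                }
            }
        }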

    Read the article

  • Can I use Ubuntu One to sync data files between two remote computers?

    - by Sleepy John
    I've got two computers, both running Ubuntu, with files in their home folders sync'd to Ubuntu One. I'd like to know if it's possible to make Ubuntu One automatically download changes that have been uploaded to Ubuntu One from one computer to the equivalent data file on the other. Clarifying a bit further, I've installed Red Notebook on both computers, so they each have their own /.rednotebook/data folder containing a series of .txt files corresponding to the monthly entries in each of them. These are sync'd to upload any changes to those .txt files to Ubuntu One. My question is: can I, and if so how do I, make Ubuntu One automatically download and replace those .txt files on the other computer after they've been updated and uploaded from the first computer? I did laboriously manage to download all those text files which had been uploaded from the first computer, from Ubuntu One, one by one, to the second computer, but what I want to do is automate this process, and that's where I'm stuck. I'm aware that things could get a bit complicated if both my computers were online at the same time and both were simultaneously making different Red Notebook entries, so that's not the scenario I'm trying to cover. All I want to achieve is that whatever updates to the files have been uploaded by one computer will automatically be downloaded to the same-named files on the other computer as soon as that second computer comes online and detects that Ubuntu One has matching but more recently sync'd files than the ones it's holding.

    Read the article

  • Making meetings much more efficient

    - by John Paul Cook
    Water. Yes, it’s that simple. There needs to be a rule that whoever convenes a meeting be required to drink water. Half a liter of water when the meeting begins and another half a liter every half hour thereafter. That simple rule will motivate the meeting organizer to find closure quickly – and the exit. What day did I post this? ...(read more)

    Read the article

  • How do I keep my Huawei E3131 3G modem from unmounting?

    - by John Perez
    I need help getting my Huawei E3131 modem to (consistently) work. I am currently running Ubuntu 12.04.2, and 3G dongles have worked many times before. However, my newly acquired Huawei E3131 is causing problems. When plugged in, Ubuntu detects the device as a CD-ROM and a modem. I can browse the dongle's contents using Nautilus, and Network Manager is able to configure and work the dongle out of the box. I can even get to surf. However, within minutes the connection drops and the CD-ROM is unmounted. I wait about 15 seconds, then the CD-ROM mounts again and Network Manager is able to connect again, with short-lived surfing. Rinse, lather and repeat. It's strange that the device mounts as a CD-ROM but works as a modem too, suggesting that mode switching happens somewhere. It's not a signal coverage problem either, because I tested the same SIM card using 3 other different dongles (2 ZTEs and another Huawei) and it is only this E3131 that has this problem. If pertinent, the other dongles weren't being detected as CD-ROMs.
    Output of lsusb:
        Bus 001 Device 002: ID 8087:0020 Intel Corp. Integrated Rate Matching Hub
        Bus 002 Device 002: ID 8087:0020 Intel Corp. Integrated Rate Matching Hub
        Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 001 Device 003: ID 0ac8:c342 Z-Star Microelectronics Corp.
        Bus 001 Device 010: ID 12d1:1506 Huawei Technologies Co., Ltd. E398 LTE/UMTS/GSM Modem/Networkcard
        Bus 001 Device 004: ID 0a5c:219b Broadcom Corp. Bluetooth 2.1 Device
    I tried installing the drivers found here, but no luck. Furthermore, my device is an E3131, but lsusb for some reason detects it as an E398. I'm not sure how that plays a role in this problem, though. I hope someone out there can help me. I'm out of ideas already. Thank you very much!

    Read the article

  • Ubuntu 12.04 Not Recognising Arduino Uno

    - by John Fish
    I recently purchased an Arduino Uno from SparkFun and tried it with a fresh Ubuntu 12.04 build. I installed the IDE, and it seemed to work fine; I could write the sketches. However, when I plugged the Arduino in, Ubuntu appeared not to recognise it. I should also note that the Arduino does draw power from the USB port, as the green "On" light is illuminated. The serial port the Arduino Uno should be "assigned to" is /dev/ttyACM0. However, this did not exist, as upon running ls /dev/tty*, I got the following results:
        /dev/tty    /dev/tty23  /dev/tty39  /dev/tty54  /dev/ttyS10  /dev/ttyS26
        /dev/tty0   /dev/tty24  /dev/tty4   /dev/tty55  /dev/ttyS11  /dev/ttyS27
        /dev/tty1   /dev/tty25  /dev/tty40  /dev/tty56  /dev/ttyS12  /dev/ttyS28
        /dev/tty10  /dev/tty26  /dev/tty41  /dev/tty57  /dev/ttyS13  /dev/ttyS29
        /dev/tty11  /dev/tty27  /dev/tty42  /dev/tty58  /dev/ttyS14  /dev/ttyS3
        /dev/tty12  /dev/tty28  /dev/tty43  /dev/tty59  /dev/ttyS15  /dev/ttyS30
        /dev/tty13  /dev/tty29  /dev/tty44  /dev/tty6   /dev/ttyS16  /dev/ttyS31
        /dev/tty14  /dev/tty3   /dev/tty45  /dev/tty60  /dev/ttyS17  /dev/ttyS4
        /dev/tty15  /dev/tty30  /dev/tty46  /dev/tty61  /dev/ttyS18  /dev/ttyS5
        /dev/tty16  /dev/tty31  /dev/tty47  /dev/tty62  /dev/ttyS19  /dev/ttyS6
        /dev/tty17  /dev/tty32  /dev/tty48  /dev/tty63  /dev/ttyS2   /dev/ttyS7
        /dev/tty18  /dev/tty33  /dev/tty49  /dev/tty7   /dev/ttyS20  /dev/ttyS8
        /dev/tty19  /dev/tty34  /dev/tty5   /dev/tty8   /dev/ttyS21  /dev/ttyS9
        /dev/tty2   /dev/tty35  /dev/tty50  /dev/tty9   /dev/ttyS22
        /dev/tty20  /dev/tty36  /dev/tty51  /dev/ttyprintk  /dev/ttyS23
        /dev/tty21  /dev/tty37  /dev/tty52  /dev/ttyS0  /dev/ttyS24
        /dev/tty22  /dev/tty38  /dev/tty53  /dev/ttyS1  /dev/ttyS25
    Any help would be greatly appreciated.

    Read the article

  • Idea to develop a caching server between IIS and SQL Server

    - by John
    I work on a few high-traffic websites that all share the same database and that are all heavily database driven. Our SQL server is maxed out and, although we have already implemented many changes that have helped, the server is still working too hard. We employ some caching in our websites, but the type of queries we use negates the use of SQL dependency caching. We tried SQL replication to try to load balance, but that didn't prove very successful because the replication process is quite demanding on the servers too, and it needed to be done frequently, as it is important that data is up to date. We do use a Varnish web caching server (Linux based) to take a bit of the load off both the web and database servers, but as a lot of the sites are customised based on the user we can only do so much. Anyway, the reason for this question... Varnish gave me an idea for a possible application that might help in this situation. Just like Varnish sits between a web browser and the web server and caches responses from the web server, I was wondering about the possibility of creating something that sits between the web server and the database server. Imagine that all SQL queries go through this SQL caching server. If it's a first-time query, then it gets recorded, the result is requested from the SQL server, and the result is stored locally on the cache server. If it's a repeat request within a set time, then the result gets retrieved from the local copy without the query being sent to the SQL server. The caching server could also take advantage of SQL dependency caching notifications. This seems like a good idea in theory. There's still the same amount of data moving back and forth from the web server, but the SQL server is relieved of the work of processing the repeat queries. I wonder how difficult it would be to build a service that sort of emulates requests and responses from SQL Server, whether SQL Server's own caching already does enough of this that there would be no benefit, or even whether someone has done this before and I haven't found it. I would welcome any feedback or any references to relevant projects.
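
    Not part of the original question, but one low-risk way to prototype the idea is at the application layer before attempting a wire-level SQL proxy. The C# sketch below caches query results for a fixed time window, keyed on the SQL text, so repeat requests inside the window never reach SQL Server. The class name and the injected executeQuery delegate are hypothetical illustrations; a production version would also need parameter-aware cache keys and real invalidation (for example, SqlDependency notifications). Something like this would typically live in-process on the web tier or in a small shared service, which is roughly the role the question imagines for a separate caching server.

        using System;
        using System.Collections.Concurrent;
        using System.Data;

        // Minimal sketch: cache query results for a fixed time window so that
        // repeat requests never reach SQL Server within that window.
        class QueryResultCache
        {
            private readonly ConcurrentDictionary<string, (DateTime CachedAt, DataTable Result)> _cache
                = new ConcurrentDictionary<string, (DateTime CachedAt, DataTable Result)>();

            private readonly TimeSpan _ttl;
            private readonly Func<string, DataTable> _executeQuery; // real database call, injected by the caller

            public QueryResultCache(TimeSpan ttl, Func<string, DataTable> executeQuery)
            {
                _ttl = ttl;
                _executeQuery = executeQuery;
            }

            public DataTable Get(string sql)
            {
                if (_cache.TryGetValue(sql, out var entry) && DateTime.UtcNow - entry.CachedAt < _ttl)
                {
                    return entry.Result; // repeat query inside the window: no round trip to SQL Server
                }

                DataTable fresh = _executeQuery(sql);    // first time (or expired): hit SQL Server
                _cache[sql] = (DateTime.UtcNow, fresh);  // record it for subsequent requests
                return fresh;
            }
        }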

    Read the article

  • Ubuntu 13.10 gives "Package 'libapache2-mod-auth-mysql' has no installation candidate" error

    - by John Crawford
    I'm trying to install my LAMP environment on Ubuntu 13.10 using my script file, which can be found here. That script worked on Ubuntu 12.04 and Ubuntu 13.04, but when I try it on Ubuntu 13.10 it gives the following error:
    E: Package 'libapache2-mod-auth-mysql' has no installation candidate
    Any idea how to fix this? Note that I do want this package to be installed.
    EDIT: I've now found out that the reason this could not be installed is that it needs the following two packages, which are missing: libmysqlclient16 and apache2.2-common. Do I just need to install these packages, or were they removed for a reason?

    Read the article

  • CSS: zooming out inside the browser moves rightmost floated div below other divs

    - by John Sonderson
    I am seeing something strange in both firefox and chrome when I increase the zoom level inside these browsers, although I see nothing wrong with my CSS... I am hoping someone on this group will be able to help. Here is the whole story: I have a right-floated top-level div containing three right-floated right. The three inner divs have all box-model measurements in pixels which add up to the width of the enclosing container. Everything looks fine when the browser size is 100%, but when I start making the browser smaller with CTRL+scrollwheel or CTRL+minus the rightmost margin shrinks down too fast and eventually becomes zero, forcing my rightmost floated inner div to fall down below the other two! I can't make sense out of this, almost seems like some integer division is being performed incorrectly in the browser code, but alas firefox and chrome both display the same result. Here is the example (just zoom out with CTRL-minus to see what I mean): Click Here to View What I Mean on Example Site Just to narrow things down a bit, the tags of interest are the following: div#mainContent div#contentLeft div#contentCenter div#contentRight I've searched stackoverflow for an answer and found the following posts which seem related to my question but was not able to apply them to the problem I am experiencing: http:// stackoverflow.com/questions/6955313/div-moves-incorrectly-on-browser-resize http:// stackoverflow.com/questions/18246882/divs-move-when-resizing-page http:// stackoverflow.com/questions/17637231/moving-an-image-when-browser-resizes http:// stackoverflow.com/questions/5316380/how-to-stop-divs-moving-when-the-browser-is-resized I've duplicated the html and css code below for your convenience: Here is the HTML: <!doctype html> <html> <head> <meta charset="utf-8"> <title>Pinco</title> <link href="css/style.css" rel="stylesheet" type="text/css"> </head> <body> <div id="wrapper"> <header> <div class="logo"> <a href="http://pinco.com"> <img class="logo" src="images/PincoLogo5.png" alt="Pinco" /> </a> </div> <div class="titolo"> <h1>Benvenuti!</h1> <h2>Siete arrivati al sito pinco.</h2> </div> <nav> <ul class="menu"> <li><a href="#">Menù Qui</a></li> <li><a href="#">Menù Quo</a></li> <li><a href="#">Menù Qua</a></li> </ul> </nav> </header> <div id="mainContent"> <div id="contentLeft"> <section> <article> <p> Lorem ipsum dolor sit amet, consectetur adipiscing elit. Quisque tempor turpis est, nec varius est pharetra scelerisque. Sed eu pellentesque purus, at cursus nisi. In bibendum tristique nunc eu mattis. Nulla pretium tincidunt ipsum, non imperdiet metus tincidunt ac. In et lobortis elit, nec lobortis purus. Cras ac viverra risus. Proin dapibus tortor justo, a vulputate ipsum lacinia sed. In hac habitasse platea dictumst. Phasellus sit amet malesuada velit. Fusce diam neque, cursus id dui ac, blandit vehicula tortor. Phasellus interdum ipsum eu leo condimentum, in dignissim erat tincidunt. Ut fermentum consectetur tellus, dignissim volutpat orci suscipit ac. Praesent scelerisque urna metus. Vestibulum ante ipsum primis in faucibus orci luctus et ultrices posuere cubilia Curae; Duis pulvinar, sem a sodales eleifend, odio elit blandit risus, a dapibus ligula orci non augue. Nullam vitae cursus tortor, eget malesuada lectus. Nulla facilisi. Cras pharetra nisi sit amet orci dignissim, a eleifend odio hendrerit. </p> </article> </section> </div> <div id="contentCenter"> <section> <article> <p> Maecenas vitae purus at orci euismod pretium. Nam gravida gravida bibendum. 
Donec nec dolor vel magna consequat laoreet in a urna. Phasellus cursus ultrices lorem ut sagittis. Class aptent taciti sociosqu ad litora torquent per conubia nostra, per inceptos himenaeos. Vivamus purus felis, ornare quis ante vel, commodo scelerisque tortor. Integer vel facilisis mauris. </p> <img src="images/auto1.jpg" width="272" height="272" /> <p> In urna purus, fringilla a urna a, ultrices convallis orci. Duis mattis sit amet leo sed luctus. Donec nec sem non nunc mattis semper quis vitae enim. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Suspendisse dictum porta quam, vel lobortis enim bibendum et. Donec iaculis tortor id metus interdum, hendrerit tincidunt orci tempor. Sed dignissim cursus mattis. </p> </article> </section> </div> <div id="contentRight"> <section> <article> <img src="images/auto2.jpg" width="272" height="272" /> <img src="images/auto3.jpg" width="272" height="272" /> <p> Cras eu quam lobortis, sodales felis ultricies, rhoncus neque. Aenean nisi eros, blandit ac lacus sit amet, vulputate sodales mi. Nunc eget purus ultricies, aliquam quam sit amet, porttitor velit. In imperdiet justo in quam tristique, eget semper nisi pellentesque. Cras fringilla eros enim, in euismod nisl imperdiet ac. Fusce tempor justo vitae faucibus luctus. </p> </article> </section> </div> </div> <footer> <div class="footerText"> <p> Copyright &copy; Pinco <br />Lorem ipsum dolor sit amet, consectetur adipiscing elit. <br />Fusce ornare turpis orci, nec egestas leo feugiat ac. <br />Morbi eget sem facilisis, laoreet erat ut, tristique odio. Proin sollicitudin quis nisi id consequat. </p> </div> <div class="footerLogo"> <img class="footerLogo" src="images/auto4.jpg" width="80" height="80" /> </div> </footer> </div> </body> </html> and here is the CSS: /* CSS Document */ * { margin: 0; border: 0; padding: 0; } body { background: #8B0000; /* darkred */; } body { margin: 0; border: 0; padding: 0; } div#wrapper { margin: 0 auto; width: 960px; height: 100%; background: #FFC0CB /* pink */; } header { position: relative; background: #005b97; height: 140px; } header div.logo { float: left; width: 360px; height: 140px; } header div.logo img.logo { width: 360px; height: 140px; } header div.titolo { float: left; padding: 12px 0 0 35px; color: black; } header div.titolo h1 { font-size: 36px; font-weight: bold; } header div.titolo h2 { font-size: 24px; font-style: italic; font-weight: bold; color: white;} header nav { position: absolute; right: 0; bottom: 0; } header ul.menu { background: black; } header ul.menu li { display: inline-block; padding: 3px 15px; font-weight: bold; } div#mainContent { float: left; width: 100%; /* width: 960px; *//* height: 860px; */ padding: 30px 0; text-align: justify; } div#mainContent img { margin: 12px 0; } div#contentLeft { height: 900px; float: left; margin-left: 12px; border: 1px solid black; padding: 15px; width: 272px; background: #ccc; } div#contentCenter { height: 900px; float: left; margin-left: 12px; border: 1px solid transparent; padding: 15px; width: 272px; background: #E00; } div#contentRight { height: 900px; float: left; margin-left: 12px; border: 1px solid black; padding: 15px; width: 272px; background: #ccc; } footer { clear: both; padding: 12px; background: #306; color: white; height: 80px; font-style: italic; font-weight: bold; } footer div.footerText { float: left; } footer div.footerLogo { float: right; } a { color: white; text-decoration: none; } Thanks.

    Read the article

  • Silverlight HVP on Silverlight TV

      The Silverlight HyperVideo Project has made two guest appearances on Silverlight TV.  In the first, I talk with John Papa about the project itself and how it has evolved. Then, during Mix, Tim Heuer and I sat down with John to discuss What's New In Silverlight 4, and I managed to sneak in a few comments about the HVP as well. Silverlight TV has numerous great interviews, and is quickly becoming a valued asset throughout the Silverlight community.  Tens of thousands of...

    Read the article

  • Disk failing on dell mini, are there diagnostic tools?

    - by John Lawrence Aspden
    Hi, my Dell Mini 10v running Ubuntu 10.10 runs OK for hours, but then will suddenly slow down drastically. Switching to a console shows lots of error messages, which also get into /var/log/syslog. These errors happen every couple of seconds. I'm figuring that the disk is failing, but is there any way to be sure, and can laptop disks be replaced easily?
        Mar 20 11:08:12 dell-mini kernel: [ 2476.378774] ata1.00: status: { DRDY ERR }
        Mar 20 11:08:12 dell-mini kernel: [ 2476.378785] ata1.00: error: { UNC }
        Mar 20 11:08:12 dell-mini kernel: [ 2476.449841] ata1.00: configured for UDMA/133
        Mar 20 11:08:12 dell-mini kernel: [ 2476.449887] ata1: EH complete
        Mar 20 11:08:14 dell-mini kernel: [ 2478.777754] ata1.00: exception Emask 0x0 SAct 0x0 SErr 0x0 action 0x0
        Mar 20 11:08:14 dell-mini kernel: [ 2478.777976] ata1.00: BMDMA stat 0x24
        Mar 20 11:08:14 dell-mini kernel: [ 2478.778059] ata1.00: failed command: READ DMA
        Mar 20 11:08:14 dell-mini kernel: [ 2478.778162] ata1.00: cmd c8/00:08:43:3b:97/00:00:00:00:00/e0 tag 0 dma 4096 in
        Mar 20 11:08:14 dell-mini kernel: [ 2478.778166]          res 51/40:00:47:3b:97/00:00:00:00:00/00 Emask 0x9 (media error)

    Read the article

  • XBRL US Conference Highlights

    - by john.orourke(at)oracle.com
    Back in early November I had an opportunity to attend the XBRL US National Conference in Philadelphia.  At the event, XBRL US announced that Oracle had joined the initiative, so I had a chance to participate in a press conference and attend a number of sessions.  Oracle joined XBRL US so we can stay ahead of the standard and leverage it in our products, and to help drive awareness with customers and improve adoption of XBRL. There were roughly 250 attendees at the event, about half of which were vendors and consultants and the rest financial reporting staff from corporate filers.  Event sponsors included Ernst & Young, SWIFT and Fujitsu.  There were also a number of XBRL technology and service providers exhibiting at the conference.  On Monday Nov. 8th, the XBRL US Steering Committee meetings and Annual Members meeting and reception were held.  At the Annual Members meeting the big news was that current XBRL US President, Mark Bolgiano, is moving to a new position at Howard Hughes Medical Center.  Campbell Pryde, who had led the Taxonomy Development for XBRL US, is taking over as XBRL US President. Other items that were highlighted at the members meeting included: The US GAAP XBRL taxonomy is being used by over 1500 SEC filers and has now been handed over to the FASB to maintain and enhance 16 filer training events were held in 2010 XBRL Global Magazine was launched Corporate Actions proposal was submitted to the SEC with SWIFT in May XBRL Labs for iPhone, XBRL US Consistency Suite launched ISO 2022 Corporate Actions Alignment with XBRL achieved The XBRL Credit Rating taxonomy was accepted Tuesday Nov. 9th included Keynotes, General Sessions, Innovation Workshop for Governments and Securities Professionals, and an Opening Reception.  General sessions included: Lessons Learned from the SEC's rollout of XBRL.  More than 18,000 errors were identified in reviews of filings between June 2009 and September 2010.  Most of these related to negative values being used where they shouldn't have.  Also, the SEC feels there are too many taxonomy extensions being created - mostly in the Cash Flow Statements.  They emphasize using existing elements in the US GAAP taxonomy and advise filers not to  create extensions to improve the visual formatting of XBRL filings. Investors and XBRL - Setting the Standard for Data Quality.  In this panel discussion, the key learning was that CFA's, academics and the financial community are not using XBRL as expected.  The issues raised include the  accuracy and completeness of filings, number of taxonomy extensions, and limited number of tools available to help analyze XBRL data.  Another big issue that was raised is the lack of historic results in XBRL - most analysts need 10 quarters of historic data.  On the positive side, XBRL has the potential to eliminate re-keying of data and errors here and can improve analytic capabilities for financial analysts once more historic data is available and more companies are providing detailed tagging of their filings. A US Roadmap for XBRL Financial Reporting.  This was a panel discussion featuring Jeff Neumann(SEC), Campbell Pryde(XBRL US), and Louis Matherne(FASB).  Key points included the fact that XBRL is currently used by 1500 companies, with 8000 more companies coming in 2011.  XBRL for Mutual Fund Reporting will start in 2011 for 8000 funds, and a Credit Rating Taxonomy has now been submitted for review.  The XBRL tagging/filing process is improving each quarter - more education is helping here.  
The FASB is looking at extensions to date, and potential additions to US GAAP taxonomy, while the SEC is evaluating filings for accuracy, consistency in tagging, and tools for analyzing data.  The big news is that the FASB 2011 US GAAP Taxonomy has been completed and reviewed by SEC.  The 2011 US GAAP Taxonomy supports new FASB accounting standards issued since 2009, has new taxonomy elements for certain industries (i.e airlines) and the elimination of 500 concepts.  (meaning they can't be used going forward but are still supported for historical comparison)  The 2011 US GAAP Taxonomy will be available for usage with Q2 2011 SEC filings.  More information about this can be found on the FASB web site.  http://www.fasb.org/home Accounting Firms and XBRL.  This session covered the Role of Audit Firms, which includes awareness and education, validation of XBRL filings, and in-house transition planning.  The main advice provided was that organizations should document XBRL mapping process, perform peer comparisons, and risk assessments on a regular basis. Wednesday Nov. 10th included more Keynotes, General Sessions on Corporate Actions, and XBRL Essentials Workshop Training for corporate filers.  The XBRL Essentials Training included: Getting Started Once you Have the Basics Detailed Footnote Tagging and Handling Tables Quality Control and Trust in the XBRL Process Bringing XBRL In-House:  What are the Options, What should you consider? The US GAAP Financial Reporting Taxonomy - Overview of the 2011 release The XBRL Essentials Training was well-attended with about 80 people.  This included a good overview of the SEC's XBRL mandate, limited liability issue, tagging levels, recommended planning process, internal vs. outsourced approach, and how to manage service providers.  I learned a lot from the session on detailed tagging.  This is the requirement that kicks in during a company's second year of XBRL filing with the SEC and applies to financial statements, footnotes and disclosures (it does not apply to MD&A, executive communications and other information).  The review of the Linkbase model, or dimensional table structure, was very interesting and can be complex to understand.  The key takeaway here is that using dimensional tables in XBRL filings can help limit the number of taxonomy extensions that are required.  The slides from this session are posted on the XBRL US web site. (http://xbrl.us/events/Pages/archive.aspx) For me, the main summary points and takeaways from the XBRL US conference are: XBRL for financial reporting has turned the corner and gone mainstream - with 1500 companies currently using it and 8000 more coming in 2011 The expected value is not being achieved by filers or consumers of XBRL data - this will improve when more companies are filing in XBRL, more history is available, and more software tools are available for analysis (hmm, sounds like an opportunity for Oracle) XBRL is becoming the global standard for all business communications beyond just the financials - i.e. adoption for mutual funds, corporate actions and others planned for the future If you would like to learn more about XBRL and the various training programs, services and software tools that are available check out the XBRL US web site and even better - become a member.  Here's a link:  http://xbrl.us/Pages/default.aspx

    Read the article

  • Oracle E-Business Suite 12 Certified on Oracle Linux 6 (x86-64)

    - by John Abraham
    Oracle E-Business Suite Release 12 (12.1.1 and higher) is now certified on 64-bit Oracle Linux 6 with the Unbreakable Enterprise Kernel (UEK). New installations of the E-Business Suite R12 on this OS require version 12.1.1 or higher. Cloning of existing 12.1 Linux environments to this new OS is also certified using the standard Rapid Clone process. There are specific requirements to upgrade technology components such as the Oracle Database (to 11.2.0.3) and Fusion Middleware as necessary for use on Oracle Linux 6. These and other requirements are noted in the Installation and Upgrade Notes (IUN) below.
    Certification for other Linux distros still underway: Certifications of Release 12 with 32-bit Oracle Linux 6, 32-bit and 64-bit Red Hat Enterprise Linux (RHEL) 6 and the Red Hat default kernel are in progress.
    References:
    - Oracle E-Business Suite Installation and Upgrade Notes Release 12 (12.1.1) for Linux x86-64 (My Oracle Support Document 761566.1)
    - Cloning Oracle Applications Release 12 with Rapid Clone (My Oracle Support Document 406982.1)
    - Interoperability Notes Oracle E-Business Suite Release 12 with Oracle Database 11g Release 2 (11.2.0) (My Oracle Support Document 1058763.1)
    - Oracle Linux website

    Read the article

  • Podcast Show Notes: Collaborate 10 Wrap-Up - Conclusion

    - by Bob Rhubart
    Both parts of my conversation with a small army of people at Collaborate 10 are now available. Listen to Part 1 Listen to Part 2   Here’s the complete list of participants: Floyd Teter - Project Manager at Jet Propulsion Lab, OAUG Board Blog | Twitter | LinkedIn | Oracle Mix | Oracle ACE Profile Mark Rittman - EMEA Technical Director and Co-Founder, Rittman Mead,  ODTUG Board Blog | Twitter | LinkedIn | Oracle Mix | Oracle ACE Profile Chet Justice - OBI Consultant at BI Wizards Blog | Twitter | LinkedIn | Oracle Mix | Oracle ACE Profile Elke Phelps - Oracle Applications DBA at Humana, OAUG SIG Chair Blog | LinkedIn | Oracle Mix | Book | Oracle ACE Profile Paul Jackson - Oracle Applications DBA at Humana Blog | LinkedIn | Oracle Mix | Book Srini Chavali - Enterprise Database & Tools Leader at Cummins, Inc Blog | LinkedIn | Oracle Mix Dave Ferguson – President, Oracle Applications Users Group LinkedIn | OAUG Profile John King - Owner, King Training Resources Website | LinkedIn | Oracle Mix Gavyn Whyte - Project Portfolio Manager at iFactory Consulting Blog | Twitter | LinkedIn | Oracle Mix John Nicholson - Channels & Alliances at Greenlight Technologies Website | LinkedIn   del.icio.us Tags: oracle,otn,collborate 10,c10,oracle ace program,archbeat,arch2arch,oaug,odtug,las vegas Technorati Tags: oracle,otn,collborate 10,c10,oracle ace program,archbeat,arch2arch,oaug,odtug,las vegas

    Read the article

  • Oracle's PeopleSoft Customer Advisory Boards Convene to Discuss Roadmap at Pleasanton Campus

    - by john.webb(at)oracle.com
    Last week we hosted all of the PeopleSoft CABs (Customer Advisory Boards) at our Pleasanton Development Center to review our detailed designs for future Feature Packs, PeopleSoft 9.2, and beyond. Over 150 customers from 79 companies attended representing a variety of industries, geographies, and company sizes. The PeopleSoft team relies heavily on this group to provide key input on our roadmap for applications as well as technology direction. A good product strategy is one part well thought out idea with many handfuls of customer validation, and very often our best ideas originate from these customer discussions. While the individual CABs have frequent interactions with our teams, it's always great to have all of them in one place and in person. Our attendance was up from last year which I attribute to two things: (1) More interest as a result of PeopleSoft 9.1 upgrade; (2) An improving economy allowing for more travel. Maybe we should index the second item meeting-to-meeting and use it as a market indicator - we'll see! We kicked off the day one session with an overview of the PeopleSoft Roadmap and I outlined our strategy around Feature Packs and PeopleSoft 9.2. Given the high adoption rate of PeopleSoft 9.1 (over 4x that of 9.0 given the same time lapse since the release date), there was a lot of interest around the 9.1 Feature Packs as a vehicle for continuous value. We provided examples of our 3 central design themes: Simplicity, Productivity, and lower TCO, including those already delivered via Feature Packs in 2010. A great example of this is the Company Directory feature in PeopleSoft HCM. The configuration capabilities and the new actionable links our CAB advised us on last Spring were made available to all customers late last year. We reviewed many more future Navigation changes that will fundamentally change the way users interact with PeopleSoft. Our old friend, the menu tree, is being relegated from center stage to a bit part, with new concepts like Activity Guides, Train Stops, Related Actions, Work Centers, Collaborative Workspaces, and Secure Enterprise Search bringing users what they need in a contextual, role based manner with fewer clicks. Paco Aubrejuan, our PeopleSoft GM, and Steve Miranda, the SVP for Fusion Applications, then discussed our plans around Oracle's Application Investment Strategy.  This included our continued investment in developing both PeopleSoft and Fusion as well as the co-existence strategy with new Fusion Apps integrating to PeopleSoft Apps. Should you want to view this presentation, a recording is available. Jeff Robbins, our lead PeopleTools Strategist, provided the roadmap for PeopleTools and discussed our continuing plan to deliver annual releases to further evolve the user experience. Numerous examples were highlighted with the Navigation techniques I mentioned previously. Jeff also provided a lot of food for thought around Lifecycle Management topics and how to remain current on releases with a  lower cost of ownership. Dennis Mesler, from Boise, was the guest speaker in this slot, who spoke about the new PeopleSoft Test Framework (PTF). Regression Testing is a key cost component when product updates are applied. This new tool (which is free to all PeopleSoft customers as part of PeopleTools 8.51) provides a meta data driven approach to recording and executing test scripts. 
Coupled with what our Usage Monitor enables, PTF provides our customers a powerful tool to lower costs and manage product updates more efficiently and at the time of their choosing. Beyond the general session, we broke out into the individual CABs: HCM, Financials, ESA/ALM, SRM, SCM, CRM, and PeopleTools/ Technology. A day and half of very engaging discussions around our plans took place for each product pillar. More about that to follow in future posts.      We capped the first day with a reception sponsored by our partners: InfoSys, SmartERP (represented by Doris Wong), and Grey Sparling  Solutions (represented by Chris Heller and Larry Grey). Great to see these old friends actively engaged in the very busy PeopleSoft ecosystem!   Jeff Robbins previews the roadmap for PeopleTools with the PeopleSoft CAB  

    Read the article

  • Planning for the Recovery

    - by john.orourke(at)oracle.com
    As we plan for 2011, there are many positive signs in the global economy, but also some lingering issues. Planning is no longer about extrapolating past performance and adjusting for growth. It is now about constantly testing the temperature of the water, formulating scenarios, assessing risk and assigning probabilities. So how does one plan for recovery and improve forecast accuracy in such a volatile environment? Here are some suggestions from a recent article I wrote, which was published in the December Financial Planning & Analysis (FP&A) newsletter from the AFP (Association for Financial Professionals):
    - Increase the frequency of forecasting
    - Get more line managers involved in the planning and forecasting process
    - Re-consider what's being measured - i.e. key financial and operational metrics
    - Incorporate risk and probability into forecasts
    - Reduce reliance on spreadsheets - leverage packaged EPM applications
    To learn more about these best practices, check out the FP&A section of the AFP website and register to receive the FP&A newsletter. AFP recently launched a new topic area focused on the FP&A function and items of interest to this group of finance professionals. In addition to the FP&A quarterly newsletter, AFP will be publishing articles, running webinars and will have an FP&A track in their annual conference, which is in Boston next November. Brian Kalish, AFP's Finance Lead, is hoping this initiative creates a valuable networking and information-sharing resource for FP&A professionals. Here's a link to the FP&A page on the AFP web site: http://www.afponline.org/pub/res/topics/topics_fpa.html If you register on the site you can access and subscribe to the FP&A newsletter and other resources. Best of luck in your planning for 2011 and beyond!

    Read the article

  • Frederick .NET User Group March Meeting

    - by John Blumenauer
    FredNUG is pleased to announce that we have an excellent speaker lined up for March. On March 16th, we'll start with pizza and social networking at 6:30 PM. Then, starting at 7 PM, Roberto Hernandez will present "Ramp up on ASP.NET MVC2!"
    The scheduled agenda is:
    - 6:30 PM - 7:00 PM - Pizza/Social Networking/Announcements
    - 7:00 PM - 8:30 PM - Main Topic: Ramp up on ASP.NET MVC2! with Roberto Hernandez
    - 8:30 PM - 8:45 PM - RAFFLE!
    Main Topic Description: Learn about the new features in the ASP.NET MVC2 Beta release. Learn how Areas, Templates, Validation, Model Metadata, and Strongly typed URL Helpers will improve the development experience on the ASP.NET MVC stack.
    Speaker Bio: Roberto Hernandez is currently a senior consultant for Excella Consulting who has been designing and writing software using Microsoft technology for the past 10 years. Originally from the Dominican Republic, Roberto is one of the founders of the NetaWeb community, the leading .NET developers community in the region. You can read more about Roberto at his blog at http://www.overridethis.com.
    Please join us and get involved in our .NET developers community!

    Read the article

  • Application Lifecycle Management Tools

    - by John K. Hines
    Leading a team comprised of three former teams means that we have three of everything.  Three places to gather requirements, three (actually eight or nine) places for customers to submit support requests, three places to plan and track work. We’ve been looking into tools that combine these features into a single product.  Not just Agile planning tools, but those that allow us to look in a single place for requirements, work items, and reports. One of the interesting choices is Software Planner by Automated QA (the makers of Test Complete).  It's a lovely tool with real end-to-end process support.  We’re probably not going to use it for one reason – cost.  I’m sure our company could get a discount, but it’s on a concurrent user license that isn’t cheap for a large number of users.  Some initial guesswork had us paying over $6,000 for 3 concurrent users just to get started with the Enterprise version.  Still, it’s intuitive, has great Agile capabilities, and has a reputation for excellent customer support. At the moment we’re digging deeper into Rational Team Concert by IBM.  Reading the docs on this product makes me want to submit my resume to Big Blue.  Not only does RTC integrate everything we need, but it’s free for up to 10 developers.  It has beautiful support for all phases of Scrum.  We’re going to bring the sales representative in for a demo. This marks one of the few times that we’re trying to resist the temptation to write our own tool.  And I think this is the first time that something so complex may actually be capably provided by an external source.   Hooray for less work! Technorati tags: Scrum Scrum Tools

    Read the article

  • Will Google penalize subdomains if content is nearly identical

    - by John Pham
    I have created a subdomain for a town in San Diego that's ranking very well for its keywords: http://carmelvalleymortgage.loanrebateinc.com/ I want to replicate this subdomain's content for another town in San Diego: http://sandiego.mortgage.loanrebateinc.com/ I will edit the text, tags, and image files specific to each town; otherwise the verbiage will be identical.
    Questions:
    1. Will Google penalize the main site?
    2. Will Google penalize the subdomains and list the content as spam?
    3. If yes to either 1 or 2, what strategies can I implement to prevent this?
    I'm using WordPress.

    Read the article

  • Correct term for PSD to HTML to CMS

    - by John Magnolia
    Hi, I have heard a lot of different terms to describe the process of turning a website design into a editable CMS. Currently I take the design and "slice" this up into HTML and CSS then I "plug" this into a CMS. I would class this as frontend development depending on the level of customisation required for the CMS. The reason I ask is I am currently writing up my CV and have become stuck on the correct term for this. Kind Regards

    Read the article
