Search Results

Search found 11093 results on 444 pages for 'issues'.

Page 93 of 444

  • Windows Azure Training Kit October 2012 Release

    - by Clint Edmonson
    The Windows Azure Technical Evangelism team has been busy lately, and we want to share what they've been working on. As you know, we release the Windows Azure Training Kit on a regular cadence, so I'm pleased to announce the Windows Azure Training Kit October 2012 Release. This update of the training kit includes 47 hands-on labs, 24 demos, and 38 presentations designed to help you learn how to build applications that use Windows Azure services, including hands-on labs updated for the latest versions of Visual Studio 2012 and Windows 8, plus new demos and presentations.
    Essential Links: Windows Azure Training Kit Download | Windows Azure Training Kit GitHub [Issues]
    Updated Presentations With Speaker Notes: Your voices were heard loud and clear! I am excited to announce that speaker notes have been added to the majority of the content we have available. The updated decks containing speaker notes are:
    - Foundation: SQL Federation; Virtual Machine Overview; Virtual Networks; Windows 8 and Windows Azure Web Sites; Windows Azure Cloud Services; Windows Azure Overview; Windows Azure Service Bus; Deploying Active Directory; Building Apps With IaaS and PaaS; Identity and Access Control; Linux Virtual Machines; Managing Virtual Machines PowerShell; Migrating Apps and Workloads; Scalable Global and Highly Available Apps; Security and Identity; SQL Database; SQL Database Migration; Cloud Service Life Cycle
    - DevCamps: Cloud Services; iOS, Android and Windows Azure; Windows 8 and Windows Azure Web Sites; Windows 8 and Windows Azure Mobile Services
    Added Localized Content: Due to the excitement in the community surrounding the Mobile Services launch, it was apparent that we needed localized content to continue delivering the message around Windows Azure Mobile Services. Localized content is available in French, Japanese, German, Chinese (Taiwan), Spanish, Italian, Korean, Portuguese (Brazilian), and Russian.
    Updated Hands-On Labs: To support those who have upgraded to Visual Studio 2012 or are trying out the Visual Studio 2012 Express editions, we have made sure that the content is available and supported (selected labs only) in Visual Studio 2012 Express and up: Windows Azure Traffic Manager; Introduction to Cloud Services; Service Bus Messaging; Introduction to Access Control Service. This adds a significant amount of additional content, so we have revamped the Hands-On Lab Navigation page to include subsections for Visual Studio 2012 Labs, Visual Studio 2010 Labs, Open Source Labs, Scenario Labs, and All Labs.
    Added Demos: Demos are available for a number of presentations in the Foundation, DevCamp, ITPro Event, and Device + Service DevCamps. You can browse through the demos on the respective Demo Navigation page or on GitHub (links are provided in the demo listing): HelloASP; Connecting Cloud Services; Service Bus Relay; Windows 8 and Mobile Services; URL Shortener iOS Client; Migrating a Web Farm; Deploying Active Directory; URL Shortener Service (PHP); Geo-Location Service (PHP); Geo-Location Android Client; Getting Started with VMs; Load Balancing Availability; Deploying Hybrid Apps; Migrate VM AppController; Geo-Location iOS Client; Scale Up/Down Using CSUpload; URL Shortener Android Client; Imaging Virtual Machines.
    The Windows Azure Training Kit is open source and available on GitHub, enabling the community to report issues, or to fork and either extend the solution or commit bug fixes back to the Training Kit. You can find more details about the training kit on our GitHub page, including guidelines on how to commit back to the project. Stay tuned to my Twitter feed for Windows Azure and other Microsoft announcements, updates, and links: @clinted

    Read the article

  • Google I/O 2010: Google TV Keynote, Day 2 - CEO Partner Panel

    Due to licensing and permissions issues, we are unable to show the full Google TV demonstration from the Day 2 keynote at Google I/O. Until we are able to get these permissions, please check out these clips. For Google I/O session videos, presentations, developer interviews and more, go to: code.google.com/io (From: GoogleDevelopers, 22:43)

    Read the article

  • 30 Great Photoshop Tips and Tricks to Help Your Computer Graphic Skills

    - by Lori Kaufman
    Photoshop is a powerful, but complex, graphics program that can be difficult to learn and frustrating to use. We have published many articles about tips and tricks for using Photoshop and how to fix annoying issues you may encounter. This article compiles 30 of the best tips and tricks we have documented to help you get the most out of Photoshop.

    Read the article

  • StreamInsight 1.0 Released

    - by Roman Schindlauer
    One piece in the set of products offered in SQL Server 2008 R2 that has generated a lot of buzz and interest during its CTP phase is StreamInsight, Microsoft's platform for Complex Event Processing. Microsoft's information platform vision provides enterprises with a "complete approach" to managing information assets, enabling all businesses to gain strategic value from information from the desktop to the datacenter to the cloud, and StreamInsight V1 is one essential piece in this spectrum. After more than a year of blood, sweat, tears, and insane amounts of coffee, we are proud to release the first version of our Complex Event Processing Framework. Those of you who have been following our Community Technology Previews (CTPs) throughout last year have already had the chance to familiarize yourselves with the product. Early feedback was not only incredibly positive, but also very constructive, and it strongly influenced the final feature set. Four notable increments over our last public CTP are:
    - Count windows
    - Non-occurrence detection (Anti-Join)
    - Dynamic query composition at runtime
    - Synchronize time across input streams
    Additionally, many smaller issues and bugs were addressed. A few APIs changed slightly with respect to the November CTP, but porting your application to RTM should not require a lot of effort. Here are the (English) bits – choosing the evaluation license during setup lets you play with this version right away. Before you install, make sure to uninstall any previous CTP version: StreamInsight X86 | StreamInsight X64. Within a few days, we will update our product page and add download links and instructions there as well. The StreamInsight documentation is provided through a help file as part of the installation, as well as through Books Online on MSDN. We also invite you to visit the StreamInsight Blog and the StreamInsight Forum, which is a great place to discuss questions and issues with the community and the development team. Regards, Roman

    Read the article

  • Encouraging software engineers to track time

    - by M. Dudley
    How can I encourage my coworkers to track the time they spend resolving issues and implementing features? We have software to do this, but they just don't enter the numbers. I want the team to get better at providing project estimates by comparing our past estimates to actual time spent. I suspect that my coworkers don't see the personal benefit, since they're not often involved in project scheduling.

    Read the article

  • Do you care about your Oracle System Support experience?

    - by user12244613
    It has been a while since I blogged about Systems Support within Oracle, so I want to take this opportunity to raise awareness of how Oracle is communicating with its systems customers. Previously, every item was communicated in its own email message; however, not all of those messages appear to be getting the attention they require. In an effort to ensure Oracle reaches all of our Sun and Oracle Systems customers, we have created the Oracle Systems Support Newsletter. This monthly newsletter summarizes support-relevant information and covers topics that impact your support experience. For example:
    1. Did you know that sending explorer content to email addresses ending in @sun.com is going away soon? For more information, review Document 1362484.1.
    2. Are you an Auto Service Request (ASR) user? If yes, here are the latest changes:
    - ASR Manager accepts your My Oracle Support user name (email address) and password [Doc ID 1345484.1]
    - The ASR IP address for secure file transfer has changed [Doc ID 1338575.1]
    - ASR "No Heartbeat" status – find out how to resolve it [Doc ID 1346328.1]
    3. Did you notice we have changed the Service Request options for hardware and introduced a new problem category called "Automated Diagnosis"? This service streamlines the data you send in and then automatically updates your My Oracle Support Service Request with any known issues found. It also fast-tracks hardware failures by sending parts as soon as the data is analyzed. Have you used this new feature? If yes, tell us about it – take the 5-minute survey.
    4. Are you being proactive, or are you still firefighting in reactive mode? If you are being proactive with your Oracle Systems products, you might have used Oracle Sun System Analysis. Did you find it helpful? Can we improve it? You tell us – take the 5-minute survey.
    5. Are you aware that attaching files to your Service Request enables the support engineer to start work straight away? For a summary of products and files, review the Newsletter.
    6. Are you struggling to find patches, firmware, or product downloads? If yes, these types of issues are all addressed in the Newsletter.
    If this is the type of information you want to know about each month, take the time to read the Newsletter and bookmark it in My Oracle Support so you can stay informed. Thanks for your time.

    Read the article

  • An update on using Rosetta Stone: Studio now isn't very useful and is not great value as an add-on option

    - by Greg Low
    I had a surprisingly large number of responses to my previous posting about learning Chinese. Here is an update for those considering Rosetta Stone (www.rosettastone.com) for Chinese, Spanish, or any other language they offer.
    I had to renew my "Studio" subscription today, and it's now a much worse deal than it was. It's now $75 for 6 months of Studio sessions. Online classes used to be 45 minutes; recently they reduced them to 20 minutes, and given how often people have connection issues, etc., those 20 minutes can disappear very quickly. They've also reduced the number of sessions you can attend. You used to be able to have 2 scheduled at any point in time; now they limit you to 2 "group sessions" per month during the period (you can pay for additional private sessions). The combination of these two changes makes it much less useful: two 20-minute sessions per month is an almost meaningless amount of practice. They also now automatically switch you to auto-renewal when you subscribe. They tell you where to remove this auto-renewal, but the first 4 or 5 times that I went into that screen, no such option appeared. Later, an option did appear and I used it. Overall, things just aren't what they used to be at Rosetta Stone. It's now pretty hard to recommend the Studio option, where it was a no-brainer before.
    FURTHER UPDATE: <sigh> Even after I renewed, I could not even connect to their "new" service. Although the system processed the renewal, it still tells me it's expired. My online chat person "Siva S" tells me that the problem is that I've purchased all 5 levels of the program. I can't wait till they explain to me how making an extra purchase from them stops me from logging on. Siva told me that they had "renewed" the program, that I'd have to speak to Customer Care, and that they weren't available – and then disconnected. Impressive (not). Their website is now full of issues too: it insists that my billing address is in the USA, even though it pretends to accept changes to it. Overall, it's gone from something that could be recommended (with some limitations) to an app to avoid. That's a pity, as I liked much of it before.

    Read the article

  • Google I/O 2010: Google TV Keynote - Under The Hood

    Due to licensing and permissions issues, we are unable to show the full Google TV demonstration from the Day 2 keynote at Google I/O. Until we are able to get these permissions, please check out these clips. For Google I/O session videos, presentations, developer interviews and more, go to: code.google.com/io (From: GoogleDevelopers, 02:02)

    Read the article

  • Wired LAN problem in Ubuntu 12.04

    - by user137935
    I am new to Ubuntu. I have installed Ubuntu 12.04 in a partition (dual boot) using Wubi. My WiFi is working fine, but the wired LAN is detected and still doesn't work. I have tried some of the suggested solutions. I opened the interfaces file:
    gksu gedit /etc/network/interfaces
    and added these lines to the file:
    auto eth0
    iface eth0 inet dhcp
    Then I restarted networking:
    sudo service networking restart
    But the problem still persists. Why does Ubuntu 12.04 have issues with wired LAN when installed this way? Any help would be appreciated.
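    In case it helps with diagnosis, the kernel's view of the link can be checked too. Here is a small sketch (Python, assuming the interface is named eth0 – on some setups the name differs):

        # Read the kernel's view of the wired link: "operstate" says whether the
        # interface is up, and "carrier" whether a cable/link is actually detected.
        from pathlib import Path

        iface = Path("/sys/class/net/eth0")  # assumed interface name
        for attr in ("operstate", "carrier"):
            try:
                print(attr, (iface / attr).read_text().strip())
            except OSError as exc:  # carrier is unreadable while the link is down
                print(attr, f"unreadable ({exc})")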

    Read the article

  • Will Ubuntu work on a Dell Inspiron 15R?

    - by Daniel Davis
    I want to know how to install Ubuntu on a Dell Inspiron 15R with an Intel Core i3, 3GB RAM, Intel HD graphics, and a 320GB HDD. I've heard some people have had issues with the wireless card. Also, a few days ago I tried to install the 64-bit version of Ubuntu 10.10 and the installation hung on the "Who are you?" screen. Can anyone help me? I really want to get into Linux, but without problems with the laptop's drivers.

    Read the article

  • Installing MOSS 2007 on Windows 2008 R2

    - by Manesh Karunakaran
    When you try to install MOSS 2007 on Windows 2008 R2 using installation media older than SP2, you will get an error saying that "This program is blocked due to compatibility issues". All is not lost, though: all you need to do is slipstream the SP2 updates into the MOSS 2007 setup. Here's a nice how-to on doing that: http://blogs.technet.com/seanearp/archive/2009/05/20/slipstreaming-sp2-into-sharepoint-server-2007.aspx. Once you slipstream the SP2 updates, you will be able to continue with the installation without the above error. HTH.
    You may already have read on various blogs about the April Cumulative Update for individual SharePoint components. The server packages (also known as "Uber" packages) of the April Cumulative Update for Microsoft Office SharePoint Server 2007 and Windows SharePoint Services 3.0 are now ready for download:
    - Windows SharePoint Services 3.0 April cumulative update package: http://support.microsoft.com/hotfix/KBHotfix.aspx?kbnum=968850 (description: http://support.microsoft.com/kb/968850)
    - Office SharePoint Server 2007 April cumulative update package: http://support.microsoft.com/hotfix/KBHotfix.aspx?kbnum=968851 (description: http://support.microsoft.com/kb/968851)
    To keep all files in a fresh SharePoint installation up to date, the following installation sequence is recommended:
    1. Service Pack 2 for Windows SharePoint Services 3.0
    2. Service Pack 2 for Office SharePoint Server 2007
    3. April Cumulative Update package for Windows SharePoint Services 3.0
    4. April Cumulative Update package for Office SharePoint Server 2007
    Please note: starting with the April Cumulative Update, the packages no longer install on a farm without a service pack; you must have installed either Service Pack 1 (SP1) or SP2 prior to installing the cumulative updates. After applying the preceding updates, run the SharePoint Products and Technologies Configuration Wizard or "psconfig -cmd upgrade -inplace b2b -wait" on the command line. This needs to be done on every server in the farm with SharePoint installed. The version of the content databases should be 12.0.6504.5000 after successfully applying these updates. For more in-depth guidance on the update process, we recommend the following articles, which describe the correct way to deploy updates, identify known issues (and resolutions), and explain how to create slipstream builds:
    - Deploy software updates for Windows SharePoint Services 3.0: http://technet.microsoft.com/en-us/library/cc288269.aspx
    - Deploy software updates for Office SharePoint Server 2007: http://technet.microsoft.com/en-us/library/cc263467.aspx
    - Create an installation source that includes software updates (Windows SharePoint Services 3.0): http://technet.microsoft.com/en-us/library/cc287882.aspx
    - Create an installation source that includes software updates (Office SharePoint Server 2007): http://technet.microsoft.com/en-us/library/cc261890.aspx

    Read the article

  • java.net outage

    - by alexismp
    The GlassFish website has been down for a number of hours (together with a number of other projects) as a result of a general outage in the java.net datacenter. The team is working hard on getting everything back to normal. You can track progress by following @ProjectKenai. Update: services should now all be back to normal. If you face java.net issues in the future, consider reporting them here. And now, back to work!

    Read the article

  • Windows Azure Diagnostics: Next to Useless?

    - by Your DisplayName here!
    To quote my good friend Christian: "Tracing is probably one of the most discussed topics in the Windows Azure world. Not because it is freaking cool – but because it can be very tedious and partly massively counter-intuitive."
    <rant> The .NET Framework has this wonderful facility called TraceSource. You define a named trace and route it to a configurable listener. This gives you a lot of flexibility – you can create a single trace file, or multiple ones. There is even nice tooling around it: SvcTraceViewer from the SDK lets you open the XML trace files, filter and sort by trace source and event type, aggregate multiple files… blablabla. Just what you would expect from a decent tracing infrastructure.
    Now comes Windows Azure. I was already grateful that, starting with SDK 1.2, we finally had a way to do tracing and diagnostics in the cloud (kudos!). But the way the Azure DiagnosticMonitor is currently implemented could be called flawed. The Azure SDK provides a DiagnosticsMonitorTraceListener – which is the right way to go. The only problem is the way this works: all traces (from all sources) get written to an ETW trace, and the DiagMon listens to these traces and copies them periodically to your storage account. So far so good. But guess what happens to your nice trace files: the trace source names get "lost" – they appear at the end of your message text. So much for filtering and sorting and aggregating (regex #fail or #win??). Every trace line becomes an entry in an Azure Storage Table, and the svclog format is gone. So much for the existing tooling.
    To solve that problem, one workaround was to write your own trace listener (!) that creates svclog files inside local storage and use the DiagMon to copy those. Christian has a blog post about that. OK, done that. Now it turns out that this mechanism no longer works in 1.3 with full IIS (see here). Quoting: "Some IIS 7.0 logs not collected due to permissions issues...The root cause to both of these issues is the permissions on the log files." And the workaround: "To read the files yourself, log on to the instance with a remote desktop connection." Now have fun with your multi-instance deployments… </rant>
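    For contrast, here is the kind of source-name fidelity the rant is missing, shown as a Python logging analogy – plainly not the .NET TraceSource API, just the same routing idea in a runnable form:

        # Python logging analogy of TraceSource: named sources routed to a
        # configurable "listener", so the source name survives into the output
        # and stays filterable (exactly what the Azure table flattens away).
        import logging

        handler = logging.FileHandler("trace.log")  # a configurable listener
        handler.setFormatter(logging.Formatter("%(name)s %(levelname)s %(message)s"))
        logging.getLogger().addHandler(handler)

        logging.getLogger("frontend").warning("slow page render")
        logging.getLogger("billing").warning("invoice retry")  # name is preserved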

    Read the article

  • iPad2 - Yet Another Fundamental Defect in an Apple product

    - by Kit Ong
    First it was the antenna defect in the iPhone 4; now it has been reported that some iPad 2 units have display issues. Apple really needs to look at its manufacturing process. It doesn't help that workers are working like robots in the factory of its main supplier, Foxconn.
    More info on the reported display light bleeding: http://www.cultofmac.com/if-your-ipad-2-has-display-problems-do-not-return-it-heres-why/87197
    How to check your iPad for a dead pixel / light leak / bleed: http://www.theipadguide.com/content/ipad-dead-pixel-test-how/7171269

    Read the article

  • CAPTCHA blocking for my scraping script?

    - by Surabhil Sergy
    I am working on a scraping project which involves getting web data and parsing it for further use. I have been using PHP and cURL to write scraping scripts that crawl web data, and I use either PHP DOM or the Simple HTML DOM Parser library for these kinds of projects. On a recent project I encountered some challenges. Initially I found that the target website had blocked my server IP, so the server could not make any successful requests to the site. Understanding these issues to be common, I bought a set of private proxies and tried making the request calls through them. Though this got successful responses, I noticed the script gets blocked in some way after 2–3 consecutive requests. On printing and checking the response, I could see a pop-up asking for CAPTCHA validation. There were no CAPTCHA characters to enter, and it also shows the error "input error: invalid referrer". On examining the source, I could see some Google reCAPTCHA scripts within. I'm stuck at this point and am not able to execute my script. The script is used for gathering data, and it needs to go through a large number of pages periodically over the site, but in the current scenario I am not able to proceed. I can see there are options to overcome these CAPTCHA issues, and scraping these kinds of sites is common. I have been checking my script's performance and responses over the last two months. During the first month I was able to execute a very large number of requests from a single IP and get results. Later I got an IP block and used private proxies, which got me some results. Now I am facing the CAPTCHA trouble. I would appreciate any help or suggestions in this regard. (Often with this kind of question the first comment is, "Have you asked for prior permission from the target?" I haven't, but I know there are many sites doing this to get details out of sites, and target sites may not often grant access. I respect legality and scraping etiquette, but I would like to know where I am stuck and how I could overcome it!) I can provide any supporting information if needed.
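    Since the error message mentions an invalid referrer, one avenue worth trying before anything CAPTCHA-specific is making the requests look more like a browser session. Below is a minimal, assumption-level sketch in Python with the requests library (rather than the PHP/cURL stack used above); every URL, proxy address, and path is a placeholder, and the string checked for in the response is an assumption:

        import random
        import time

        import requests

        BASE = "https://example.com"  # placeholder for the target site
        PROXIES = {"http": "http://user:pass@proxy-host:8080",
                   "https": "http://user:pass@proxy-host:8080"}  # placeholder proxy

        session = requests.Session()  # persistent session keeps cookies across requests
        session.headers.update({
            "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
            "Referer": BASE + "/",  # the "invalid referrer" error hints this is checked
        })

        def fetch(path):
            resp = session.get(BASE + path, proxies=PROXIES, timeout=30)
            # Back off if the anti-bot interstitial shows up instead of real content.
            if "recaptcha" in resp.text.lower():
                raise RuntimeError("CAPTCHA page served; slow down or rotate the proxy")
            time.sleep(random.uniform(2.0, 6.0))  # jittered delay between requests
            return resp.text

        html = fetch("/some/listing/page")  # placeholder path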

    Read the article

  • Webpage loading with wrong content-type after setting up CloudFlare

    - by Daniel Little
    I recently migrated my blog to the Ghost service, and I've also set up an alias DNS record with CloudFlare. While showing the blog to a colleague, I discovered that one of the posts wasn't loading properly: instead it would prompt to be downloaded, with an application/octet-stream content type. I can view all the other pages without any issues, and I believe we're both on the same network as well. Has anyone seen a wrong content type like application/octet-stream when using CloudFlare, or does anyone know what I can do to correct this?
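    One way to narrow it down is to compare what CloudFlare serves with what the origin serves. A small diagnostic sketch with assumed hostnames follows; since CloudFlare normally passes the origin's Content-Type through, this should show which side mislabels the response:

        import requests

        PATH = "/the-broken-post/"          # placeholder path of the failing page
        CDN_HOST = "blog.example.com"       # CloudFlare-proxied hostname (assumption)
        ORIGIN_HOST = "origin.example.com"  # hostname that bypasses CloudFlare (assumption)

        for host in (CDN_HOST, ORIGIN_HOST):
            resp = requests.get(f"https://{host}{PATH}", timeout=30)
            # If the origin already says application/octet-stream, the problem is
            # the Ghost/origin config, not CloudFlare.
            print(host, resp.status_code, resp.headers.get("Content-Type"))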

    Read the article

  • Five Key Strategies in Master Data Management

    - by david.butler(at)oracle.com
    Here is a very interesting Profit Magazine article on MDM: a recent customer survey reveals the deleterious effects of data fragmentation. By Trevor Naidoo, December 2010.
    Across industries and geographies, IT organizations have grown in complexity, whether due to mergers and acquisitions, or decentralized systems supporting functional or departmental requirements. With systems architected over time to support unique, one-off process needs, they are becoming costly to maintain, and the Internet has only further added to the complexity. Data fragmentation has become a key inhibitor in delivering flexible, user-friendly systems. The Oracle Insight team conducted a survey assessing customers' master data management (MDM) capabilities over the past two years to get a sense of where they are in terms of their capabilities. The responses, by 27 respondents from six different industries, reveal five key areas in which customers need to improve their data management in order to get better financial results.
    1. Less than 15 percent of organizations surveyed understand the sources and quality of their master data, and have a roadmap to address missing data domains. Examples of the types of master data domains referred to are customer, supplier, product, financial, and site. Many organizations have multiple sources of master data with varying degrees of data quality in each source – customer data stored in the customer relationship management system is inconsistent with customer data stored in the order management system. Imagine not knowing how many places you stored your customer information, and whether a customer's address was the most up to date in each source. In fact, more than 55 percent of the respondents in the survey manage their data quality on an ad hoc basis. It is important for organizations to document their inventory of data sources and then profile these data sources to ensure that there is a consistent definition of key data entities throughout the organization. Some questions to ask are: How do we define a customer? What is a product? How do we define a site? The goal is to strive for one common repository for master data that acts as a cross-reference for all other sources and ensures consistent, high-quality master data throughout the organization.
    2. Only 18 percent of respondents have an enterprise data management strategy to ensure that data is treated as an asset to the organization. Most respondents handle data at the department or functional level and do not have an enterprise view of their master data. The sales department may track all their interactions with customers as they move through the sales cycle, the service department is tracking their interactions with the same customers independently, and the finance department also has a different perspective on the same customer. The salesperson may not be aware that the customer she is trying to sell to is experiencing issues with existing products purchased, or that the customer is behind on previous invoices. The lack of a data strategy makes it difficult for business users to turn data into information via reports. Without the key building blocks in place, it is difficult to create key linkages between customer, product, site, supplier, and financial data. These linkages make it possible to understand patterns. A well-defined data management strategy is aligned to the business strategy and helps create the governance needed to ensure that data stewardship is in place and data integrity is intact.
    3. Almost 60 percent of respondents have no strategy to integrate data across operational applications. Many respondents have several disparate sources of data with no strategy to keep them in sync with each other. Even though there is no clear strategy to integrate the data (see #2 above), the data needs to be synced and cross-referenced to keep the business processes running. About 55 percent of respondents said they perform this integration on an ad hoc basis, and in many cases, it is done manually with the help of Microsoft Excel spreadsheets. For example, a salesperson needs a report on global sales for a specific product, but the product has different product numbers in different countries. Typically, an analyst will pull all the data into Excel, manually create a cross-reference for that product, and then aggregate the sales. The exact same procedure has to be followed if the same report is needed the following month. A well-defined consolidation strategy will ensure that a central cross-reference is maintained, with updates in any one application being propagated to all the other systems, so that data is synchronized and up to date. This can be done in real time or in batch mode using integration technology.
    4. Approximately 50 percent of respondents spend manual effort cleansing and normalizing data. Information stored in various systems usually follows different standards and formats, making it difficult to match the data. A customer's address can be stored in different ways using a variety of abbreviations – for example, "av" or "ave" for avenue. Similarly, a product's attributes can be stored in a number of different ways; for example, a size attribute can be stored in inches or entered using the inch mark ("). These types of variations make it difficult to match up data from different sources. Today, most customers rely on manual, heroic efforts to match, cleanse, and de-duplicate data – clearly not a scalable, sustainable model. To solve this challenge, organizations need the ability to standardize data for customers, products, sites, suppliers, and financial accounts; however, less than 10 percent of respondents have technology in place to automatically resolve duplicates. It is no wonder, therefore, that we get communications about products we don't own, at addresses where we don't reside, and through channels (like direct mail) we don't like. An all-too-common example: customers end up receiving duplicate communications, which not only impacts customer satisfaction, but also incurs additional mailing costs. Cleansing, normalizing, and standardizing data will help address most of these issues (a toy sketch of the idea follows the last point below).
    5. Only 10 percent of respondents have the ability to share data that was mastered in a master data hub. Close to 60 percent of respondents have efforts in place that profile, standardize, and cleanse data manually, and the output of these efforts is stored in spreadsheets in various parts of the organization. This valuable information is not easily shared with the rest of the organization and, more importantly, this enriched information cannot be sent back to the source systems so that the data is fixed at the source. A key benefit of a master data management strategy is not only to clean the data, but to also share the data back to the source systems as well as other systems that need the information. Aside from the source systems, another key beneficiary of this data is the business intelligence system.
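    As referenced in point 4 above, here is a toy sketch of the standardization step – all names and abbreviation mappings are illustrative, not from the article:

        # Expand address abbreviations before matching, so "123 Main Av." and
        # "123 Main Avenue" resolve to the same record and duplicates collapse.
        ABBREV = {"av": "avenue", "ave": "avenue", "st": "street", "rd": "road"}

        def normalize(address: str) -> str:
            words = address.lower().replace(",", " ").split()
            return " ".join(ABBREV.get(w.rstrip("."), w.rstrip(".")) for w in words)

        records = ["123 Main Av.", "123 Main Avenue", "99 Oak St"]
        unique = {normalize(r) for r in records}  # 2 entries, duplicates collapsed
        print(unique)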
    Having clean master data as input to business intelligence systems provides more accurate and enhanced reporting.
    Characteristics of Stellar MDM
    When deciding on the right master data management technology, organizations should look for solutions that have four main characteristics:
    - enterprise-grade MDM performance
    - complete technology that can be rapidly deployed and addresses multiple business issues
    - end-to-end MDM process management with data quality monitoring and assurance
    - pre-built, business-relevant MDM applications with data stores and workflows
    These master data management capabilities will aid in moving closer to a best-practice maturity level, delivering tremendous efficiencies and savings as well as revenue growth opportunities as a result of better understanding your customers.
    Trevor Naidoo is a senior director in Industry Strategy and Insight at Oracle.

    Read the article

  • Post-retirement plans for Ruby package in Precise 12.04LTS?

    - by Alexandr Kurilin
    Now that Ruby 1.8.7 is officially retired, is it at all possible that the default ruby package in precise might be updated all the way up to 2.0? I understand that backporting is generally not done in Ubuntu releases out of concern about breaking existing systems, but in this case we're dealing with a deprecated package that might either already have security issues, or have more of them discovered in the future without any chance of them being fixed. What's the plan? Will 12.04 march on with 1.8.7?

    Read the article

  • Google I/O 2010: Google TV Keynote - Push Android Apps From Web To TV

    Due to licensing and permissions issues, we are unable to show the full Google TV demonstration from the Day 2 keynote at Google I/O. Until we are able to get these permissions, please check out these clips. For Google I/O session videos, presentations, developer interviews and more, go to: code.google.com/io (From: GoogleDevelopers, 02:09)

    Read the article

  • Hot Off the Presses! Get Your Release of the October Procurement Newsletter!

    - by LuciaC
    Get all the recent news and featured topics for the Procurement modules, including Purchasing, iProcurement, Sourcing, and iSupplier. Find out what Procurement experts are recommending to prevent and resolve issues. Important links are also included. The October newsletter features articles on:
    - The new Procurement Enhancement Request Community
    - Procurement Community Development Corner
    - An updated version of the PO Approvals Analyzer
    - Uploading Files
    And there is much more. Access the newsletter now: Doc ID 111111.1

    Read the article

  • SQL – Biggest Concerns in a Data-Driven World

    - by Pinal Dave
    The ongoing chaos over government agencies' snooping has ignited a heated debate on the privacy of personal data and its use by government and other institutions. It has created a feeling of disapproval and distrust among users. This incident proves to be a lesson for companies that are looking to leverage their business using a data-driven approach. According to analysts, the goal of gathering personal information should be to deliver benefits to both parties – the user as well as the data collector (government or business). Using data the right way is crucial, and companies need to deploy the right software applications and systems to ensure that their efforts are well directed. However, various issues plague analysts regarding the available software, as highlighted below.
    According to the InformationWeek 2013 Survey of Analytics, Business Intelligence and Information Management, to which 541 business technology professionals contributed as respondents, the biggest concern was deemed to be the scarcity of expertise and the high costs associated with it. This concern was voiced by as many as 38% of the participants. A close second was the issue of data warehouse appliance platforms being expensive, with 33% of those surveyed believing it to be a huge roadblock. Another revelation was that 31% of professionals weren't even sure how data analytics could create business opportunities for them, and another 17% shared that they found data platform technologies such as Hadoop and NoSQL hard to learn. These results clearly point out that awareness and expertise issues also need much attention. Unless the demand-supply gap of Business Intelligence professionals well versed in data analysis technologies is met, this divide is going to affect how companies make the most of their BI campaigns. One key action that can be taken to salvage the situation is to provide training on data analytics concepts. Koenig Solutions offers courses on many such technologies, including a course on MCSE SQL Server 2012: BI Platform. So it's time to brush up your skills and get down to work in the data-driven world that awaits you.
    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Read the article

  • thunar-archive-plugin not working

    - by Sergio
    After experiencing serious, still unresolved performance issues with Nautilus, I decided to move to Xubuntu, so I installed its metapackage from Ubuntu and started using it. It turns out that the archive plugin for Thunar (which provides the "Extract here" option in the contextual menu when right-clicking a compressed archive) is not working, even after I apt-get purged it and reinstalled it. It simply doesn't show its options in the contextual menu. What should I do to make it work?

    Read the article

  • What are the advantages of GLSL's compilation model?

    - by Kos
    GLSL is fundamentally different from other shader solutions because the server (GPU driver) is responsible for shader compilation. Cg and HLSL are (AFAIK) generally compiled a priori and sent to the GPU in that way. This causes some real-world practical issues:
    - many drivers provide buggy compilers
    - compilers differ in terms of strictness (one GPU can accept a program while another won't)
    - we can't know how the assembler code will be optimised
    What are the upsides of GLSL's current approach? Is it worth it?
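    To make the issue concrete: under this model, robust GL code has to treat shader compilation as a runtime operation that can fail differently per driver. A hedged sketch of the usual pattern follows, in Python with the (assumed available) glfw and PyOpenGL packages; the shader source is a trivial placeholder:

        import glfw
        from OpenGL import GL

        FRAG_SRC = """
        #version 120
        void main() {
            gl_FragColor = vec4(1.0);  // trivial placeholder shader
        }
        """

        def compile_or_report(source, stage=GL.GL_FRAGMENT_SHADER):
            shader = GL.glCreateShader(stage)
            GL.glShaderSource(shader, source)
            GL.glCompileShader(shader)  # compilation happens in the driver, at runtime
            ok = GL.glGetShaderiv(shader, GL.GL_COMPILE_STATUS)
            log = GL.glGetShaderInfoLog(shader)  # wording/strictness varies per driver
            if not ok:
                raise RuntimeError(f"driver rejected shader: {log}")
            return shader

        if __name__ == "__main__":
            glfw.init()
            glfw.window_hint(glfw.VISIBLE, glfw.FALSE)  # hidden window, just for a GL context
            win = glfw.create_window(64, 64, "ctx", None, None)
            glfw.make_context_current(win)
            compile_or_report(FRAG_SRC)
            glfw.terminate()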

    Read the article
