Search Results

Search found 7086 results on 284 pages for 'proprietary syntax'.

Page 94/284 | < Previous Page | 90 91 92 93 94 95 96 97 98 99 100 101  | Next Page >

  • Unity 3D won't load - Choosing "Ubuntu" during login still loads Unity 2D

    - by Shasteriskt
    I just installed Ubuntu 11.10 on my ASUS N43SL laptop, and Unity 3D loaded right after installation. But upon reboot, I noticed that Ubuntu loaded Unity 2D instead of Unity 3D. I logged off and made sure I had "Ubuntu" selected during login, but after many attempts Unity 3D just won't load. Just to make it definite, I ran echo $DESKTOP_SESSION in the terminal and it says ubuntu-2d, which is not expected. My laptop runs an i5 with a 1 GB NVIDIA GT540M, so I don't think it's a case of lacking graphics capability. And yes, I have installed the proprietary driver using "Additional Drivers" in "System Settings". What could be the cause of this problem? How can it be fixed?

    Read the article

  • HP Compaq 2510p wireless disabled by hardware switch

    - by mylovelyhorse
    I have an HP Compaq 2510p laptop running Ubuntu 12.04 LTS. Ubuntu reports that wireless is disabled by means of a hardware switch. There is a 'soft-key' button on the laptop to control the physical wireless hardware, but it does not respond. There is no other button, slider, or (Fn)+ combination to control the physical wireless hardware. There is no BIOS function to disable wireless (and on XP, the previous OS, wireless functioned fine).

        mike@ubuntu:~$ rfkill list all
        0: brcmwl-0: Wireless LAN
                Soft blocked: no
                Hard blocked: yes
        1: hp-wifi: Wireless LAN
                Soft blocked: no
                Hard blocked: no

    Running rfkill unblock all doesn't change things, and I can see no way to switch from 0: to 1: (if that's even possible, or desirable, in the first place). I have checked for additional drivers; the Broadcom proprietary wireless driver is already installed and has a green light. Essentially, I believe I need to make the HP 'soft-keys' work, or at least the wireless card toggle. Advice gratefully received. Cheers, M

    Read the article

  • Tips on Migrating from AquaLogic .NET Accelerator to WebCenter WSRP Producer for .NET

    - by user647124
    This year I embarked on a journey to migrate a group of ASP.NET web applications, developed to integrate with WebLogic Portal 9.2 via the AquaLogic® Interaction .NET Application Accelerator 1.0, to instead use the Oracle WebCenter WSRP Producer for .NET and integrate with WebLogic Portal 10.3.4. It has been a very winding path, and this blog entry is intended to share both the lessons learned and the approaches that led to those lessons. Like most journeys of discovery, it was not a direct path, and there are notes to let you know when it is practical to skip a section if you are in a hurry to get from here to there.

    For the Curious
    From the perspective of necessity, this section would be better at the end. If it were there, though, it would probably be read by far fewer people, including those who are actually interested in these types of sections. Those in a hurry may skip past and be none the worse for it in dealing with the hands-on bits of performing a migration from .NET Accelerator to WSRP Producer. For others who want to talk about why they did what they did after they did it, or just want to know for themselves, enjoy.

    A Brief (and Edited) History of the WSRP for .NET Technologies (as Relevant to This Post)
    Note: This section is for those who are curious about why the migration path is not as simple as with many other Oracle technologies. You can skip this section in its entirety and still be just as competent in performing a migration as if you had read it.

    The deployed architecture to be migrated and upgraded achieved its initial integration between .NET and J2EE over the WSRP protocol through the AquaLogic Interaction .NET Application Accelerator. The .NET Accelerator allowed applications written in ASP.NET and deployed on a Microsoft Internet Information Server (IIS) to interact with a WebLogic Portal application deployed on a WebLogic (J2EE application) Server (both version 9.2, the state of the art at the time). When this architectural decision was made, both the AquaLogic and WebLogic brands were owned by BEA Systems. The AquaLogic brand included products acquired by BEA through the acquisition of Plumtree, whose flagship product was a portal platform available in both J2EE and .NET versions. As part of this dual-technology support, an adapter was created to facilitate the use of WSRP as a communication protocol where customers wished to integrate components from both versions of the Plumtree portal. The adapter evolved over several product generations to include a broad array of both standard and proprietary WSRP integration capabilities.

    Later, BEA Systems was acquired by Oracle. Over the course of several years Oracle has acquired a large number of portal applications and has taken the strategic direction of migrating users of these myriad (and formerly competitive) products to the Oracle WebCenter technology stack. As part of Oracle's strategic technology roadmap, older portal products are being scheduled for end of life, including the portal products that were part of the BEA acquisition. The .NET Accelerator has been modified over a very long period of time, with features driven by users of the product and developed under three different vendors (each a direct competitor in the same solution space prior to merger).

    The Oracle WebCenter WSRP Producer for .NET was introduced much more recently, with the key objective of specifically addressing the needs of WebCenter customers developing solutions accessible through both J2EE and .NET platforms utilizing the WSRP specifications. The Oracle Product Development Team also provides these insights on the drivers for developing the WSRP Producer:

    - Support for ASP.NET AJAX. Controls using the ASP.NET AJAX script manager do not function properly in the Application Accelerator for .NET.
    - Support for 2-way SSL in WLP. This was not possible with the proxy/bridge set up in the existing Application Accelerator for .NET.
    - Allow developers to code portlets (Web Parts) using the .NET Framework rather than a proprietary framework. Developers had to use the Application Accelerator for .NET plug-ins to Visual Studio to manage preferences and profile data. This is now replaced with the .NET Framework Personalization (for preferences) and Profile providers.
    - The WSRP Producer for .NET was created as a new way of developing .NET portlets. It was never designed to be an upgrade path for the Application Accelerator for .NET. .NET developers would create new .NET portlets with the WSRP Producer for .NET and leave any existing .NET portlets running in the Application Accelerator for .NET.

    The advantage of creating a new solution for WSRP is a product that is far easier for Oracle to maintain and support, which in turn improves quality, reliability and maintainability for its customers. No changes are required to J2EE applications consuming the WSRP portlets previously rendered by the .NET Accelerator in order to migrate from the AquaLogic WSRP solution. For some customers using the .NET Accelerator, the challenge is adapting their current .NET applications to work with the WSRP Producer (or any other WSRP adapter, as they are proprietary by nature). Part of this adaptation is the need to deploy the .NET applications as children of the WSRP Producer web application, which acts as the root.

    Differences between .NET Accelerator and WSRP Producer
    Note: This section is for those who are curious about why the migration is not as pluggable as something such as changing security providers in WebLogic Server. You can skip this section in its entirety and still be just as competent in performing a migration as if you had read it.

    The basic terminology used to describe the participating applications in a WSRP environment is the same for both the .NET Accelerator and the WSRP Producer: Producer and Consumer. In both cases the .NET application serves as what is referred to in a WSRP environment as the Producer. The difference lies in how the two adapters create the WSRP translation of the .NET application. The .NET Accelerator, as the name implies, is meant to serve as a quick way of adding WSRP capability to a .NET application. As such, at a high level, the .NET Accelerator behaves as a proxy for requests between the .NET application and the WSRP Consumer: a WSRP request is sent from the Consumer to the .NET Accelerator, the .NET Accelerator transforms this request into an ASP.NET request, receives the response, and then transforms the response into a WSRP response. The .NET Accelerator is deployed as a stand-alone application on IIS. The WSRP Producer, in contrast, is deployed as a parent application on IIS, and all ASP.NET modules that will be made available over WSRP are deployed as children of the WSRP Producer application. In this manner, the WSRP Producer acts more as a request filter than a proxy in the WSRP transactions between Producer and Consumer.

    Highly Recommended: Enabling Logging
    Note: You can skip this section now, but you will most likely want to come back to it later, so why not just read it now?

    Logging is very helpful in tracking down the causes of any anomalies during testing of migrated portlets. To enable WSRP Producer logging, update the Application_Start method in the Global.asax.cs of your .NET application by adding log4net.Config.XmlConfigurator.Configure();. IIS logs will usually (in a standard configuration) be in a subfolder under C:\WINDOWS\system32\LogFiles\W3SVC. WSRP Producer logs will be found at C:\Oracle\Middleware\WSRPProducerForDotNet\wsrpdefault\Logs\WSRPProducer.log. InputTrace.webinfo and OutputTrace.webinfo are located under C:\Oracle\Middleware\WSRPProducerForDotNet\wsrpdefault and can be useful in debugging issues related to markup transformations.
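    To illustrate the Application_Start change just described, here is a minimal sketch of the Global.asax.cs file; the log4net call is the one named above, while the namespace and class name are placeholders for your own application's.

        // Global.asax.cs -- minimal sketch; "LegacyWebsite" is a placeholder namespace.
        using System;

        namespace LegacyWebsite
        {
            public class Global : System.Web.HttpApplication
            {
                protected void Application_Start(object sender, EventArgs e)
                {
                    // Read the log4net configuration so WSRP Producer activity
                    // is written to WSRPProducer.log.
                    log4net.Config.XmlConfigurator.Configure();
                }
            }
        }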
    Things You Must Do

    Merge Web.Config
    Note: If you have been skipping all the sections that you can, now is the time to stop and pay attention. :)

    Because the existing .NET application will become a sub-application of the WSRP Producer, you will want to merge the required settings from the existing Web.Config into the one in the WSRP Producer.

    Use the WSRP Producer Master Page
    The Master Page installed with the WSRP Producer provides common hidden form fields and JavaScript to facilitate portlet instance management and display configuration when the child page is being rendered over WSRP. You add the Master Page by including it in the <%@ Page declaration with MasterPageFile="~/portlets/Resources/MasterPages/WSRP.Master". You then replace:

        <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN" >
        <HTML>
        <HEAD>

    with

        <asp:Content ID="ContentHead1" ContentPlaceHolderID="wsrphead" Runat="Server">

    and

        </HEAD>
        <body>
        <form id="theForm" method="post" runat="server">

    with

        </asp:Content>
        <asp:Content ID="ContentBody1" ContentPlaceHolderID="Main" Runat="Server">

    and finally

        </form>
        </body>
        </HTML>

    with

        </asp:Content>

    In the event you already use Master Pages, adapt your existing Master Pages to be sub-masters. See Nested ASP.NET Master Pages for a detailed reference on how to do this.

    It Happened to Me, It Might Happen to You... Or Not

    Watch for Use of Session or Request in OnInit
    If the .NET application being migrated has pages developed on the assumption that the user was authenticated in an earlier page request, there may be direct or indirect references in the OnInit method to Request or Session objects that have not been created yet. This will vary from application to application, so the recommended approach is to test first. If there is an issue with a page running as a WSRP portlet, check for references to Session or Request objects in the OnInit method (including references by methods called within OnInit). If there are any, the simplest solution is to move that logic into a new method and call it once the necessary objects are fully available; I find doing this at the start of the Page_Load method to be the simplest approach.
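    Here is a minimal sketch of that deferral, assuming the page previously read session state in OnInit; "ReportingPage", "LoadUserSettings" and the "userObj" session key are hypothetical names, not part of the product:

        using System;
        using System.Web.UI;

        public partial class ReportingPage : Page
        {
            protected override void OnInit(EventArgs e)
            {
                base.OnInit(e);
                // Do not touch Session or Request here; when the page runs as a
                // WSRP portlet they may not have been created yet.
            }

            protected void Page_Load(object sender, EventArgs e)
            {
                // By Page_Load, Request and Session are fully available,
                // so the deferred initialization is safe to run here.
                LoadUserSettings();
            }

            private void LoadUserSettings()
            {
                // Initialization that previously lived in OnInit.
                var userObj = Session["userObj"];
                // ... continue page setup with userObj ...
            }
        }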
    Case Sensitivity
    Windows and .NET tend to be forgiving about case, but Java is case sensitive. This means it is possible to have many variations of SRC= and src=, or .JPG and .jpg. The preferred solution is to make these markup instances all lower case in your .NET application. This will allow the default rewriter rules in wsrp-producer.xml to work as is. If this is not practical, then duplicate any rule where an issue is occurring due to upper- or mixed-case usage in the .NET application markup, and match the case in use in the duplicate rule. For example:

        <RewriterRule>
          <LookFor>(href=\"([^\"]+)</LookFor>
          <ChangeToAbsolute>true</ChangeToAbsolute>
          <ApplyTo>.axd,.css</ApplyTo>
          <MakeResource>true</MakeResource>
        </RewriterRule>

    may need to be duplicated as:

        <RewriterRule>
          <LookFor>(HREF=\"([^\"]+)</LookFor>
          <ChangeToAbsolute>true</ChangeToAbsolute>
          <ApplyTo>.axd,.css</ApplyTo>
          <MakeResource>true</MakeResource>
        </RewriterRule>

    While it is possible to write a regular expression that handles mixed-case usage, it would be long and strenuous to test and maintain, so the recommendation is to use duplicate rules.

    Is it Still Relative?
    Some .NET applications base relative paths on a fixed root location. With the introduction of the WSRP Producer, the root has moved up one level. References to ~/ will need to be updated to ~/portlets, and many ../ paths will need another ../ in front.

    I Can See You But I Can't Find You
    This issue was first discovered while debugging modules with code that referenced the form on a page from the code-behind by name and/or ID. The initial error presented itself as a run-time error that was difficult to interpret over WSRP but seemed clear when run as straight ASP.NET: it indicated that the object with the form name did not exist. Since the form name was no longer valid after implementing the WSRP Master Page, the likely fix seemed to be simply updating the references in the code. However, as the WSRP Master Page is external to the code, a compile-time error resulted:

        Error 155: The name 'form1' does not exist in the current context
        C:\Oracle\Middleware\WSRPProducerForDotNet\wsrpdefault\portlets\legacywebsite\module\Screens\Reporting.aspx.cs, line 51, column 52, project legacywebsite.module

    Much hair-pulling research later, it was discovered that the use of the FindControl method was causing the issue. FindControl doesn't work quite as expected once a Master Page has been introduced: controls become embedded in other controls, and finding them requires a recursion that FindControl does not perform. In code where the page form is referenced by name, there are two steps to the solution. First, the form needs to be referenced generically with Page.Form. For example, this:

        ToggleControl ctrl = new ToggleControl(frmManualEntry, FunctionLibrary.ParseArrayLst(userObj.Roles));

    becomes this:

        ToggleControl ctrl = new ToggleControl(Page.Form, FunctionLibrary.ParseArrayLst(userObj.Roles));

    Generally the form ID is referenced in most ASP.NET applications as a path to a control on the form. Reaching the control once a Master Page has been added requires an additional method to recurse through the control collections within the form and find the control ID. The following method (found at Rick Strahl's Web Log) corrects this very nicely:

        public static Control FindControlRecursive(Control Root, string Id)
        {
            // Is this the control we are looking for?
            if (Root.ID == Id)
                return Root;
            // Otherwise search each child control's subtree.
            foreach (Control Ctl in Root.Controls)
            {
                Control FoundCtl = FindControlRecursive(Ctl, Id);
                if (FoundCtl != null)
                    return FoundCtl;
            }
            return null;
        }

    Where the form name is not referenced, simply using the FindControlRecursive method in place of FindControl is all that is necessary. Following the second part of the example referenced earlier, the method called with Page.Form changes its value extraction code block from this:

        Label lblErrMsg = (Label)frmRef.FindControl("lblBRMsg");

    to this:

        Label lblErrMsg = (Label)FunctionLibrary.FindControlRecursive(frmRef, "lblBRMsg");

    The Master That Won't Step Aside
    In most migrations it is preferable to make as few changes as possible. In one case I ran across an existing Master Page that would not function as a sub-Master Page. While it would probably have been educational to trace down why, the expedient route I took was to update it to take the place of the WSRP Master Page. The changes are highlighted below:

        ...
        <asp:ContentPlaceHolder ID="wsrphead" runat="server"></asp:ContentPlaceHolder>
        </head>
        <body leftMargin="0" topMargin="0">
        <form id="TheForm" runat="server">
        <input type="hidden" name="key" id="key" value="" />
        <input type="hidden" name="formactionurl" id="formactionurl" value="" />
        <input type="hidden" name="handle" id="handle" value="" />
        <asp:ScriptManager ID="ScriptManager1" runat="server" EnablePartialRendering="true" >
        </asp:ScriptManager>

    This approach did not work for all existing Master Pages, but fortunately every other existing Master Page I have run across worked fine as a sub-Master to the WSRP Master Page.

    Moving On
    In enterprise portals, even after you get everything working, the work is not finished. Next you need to get it to where everyone will work with it.

    Migration Planning
    Provided that the server where IIS is running is adequately sized, it is possible to run both the .NET Accelerator and the WSRP Producer on the same server during the upgrade process. The upgrade can be performed incrementally, i.e., one portlet at a time, if server administration processes support it. Those processes would include the ability to manage a second producer in the consuming portal and to change over individual portlet instances from one provider to the other. If processes or requirements demand that all portlets be cut over at the same time, it needs to be determined whether this cutover should include a new producer, updating all of the portlets in the consumer, or whether the WSRP Producer portlet configuration must maintain the naming conventions used by the .NET Accelerator and simply change the WSRP endpoint configured in the consumer. In some enterprises it may even be necessary to maintain the same WSDL endpoint, in which case the IIS configuration is where the updates occur. The downside to such a requirement is that it makes rolling back very difficult, should the need arise.

    Location, Location, Location
    Not everyone wants the web application to live at the descriptively obvious wsrpdefault location, and some need to create a second WSRP site on the same server. The instructions below are from the product team and, while targeted towards making a second site, will also work for creating a site with a different name and then removing the old site. You can also change just the name in IIS.

    Manually creating a WSRP Producer site (note: all executables used are the same ones used by the installer, and "wsrpdev" will be the name of the new instance):

    1. Copy C:\Oracle\Middleware\WSRPProducerForDotNet\wsrpdefault to C:\Oracle\Middleware\WSRPProducerForDotNet\wsrpdev.
    2. Bring up a command window as an administrator.
    3. Run C:\Oracle\Middleware\WSRPProducerForDotNet\uninstall_resources\IISAppAccelSiteCreator.exe install WSRPProducers wsrpdev "C:\Oracle\Middleware\WSRPProducerForDotNet\wsrpdev" 8678 2.0.50727
    4. Run C:\Oracle\Middleware\WSRPProducerForDotNet\uninstall_resources\PermManage.exe add FileSystem C:\Oracle\Middleware\WSRPProducerForDotNet\wsrpdev "NETWORK SERVICE" 3 1
    5. Run C:\Oracle\Middleware\WSRPProducerForDotNet\uninstall_resources\PermManage.exe add FileSystem C:\Oracle\Middleware\WSRPProducerForDotNet\wsrpdev EVERYONE 1 1
    6. Open C:\Oracle\Middleware\WSRPProducerForDotNet\wsdl\1.0\WSRPService.wsdl and replace wsrpdefault with wsrpdev.
    7. Open C:\Oracle\Middleware\WSRPProducerForDotNet\wsdl\2.0\WSRPService.wsdl and replace wsrpdefault with wsrpdev.

    Tests:

    1. Bring up a browser on the host itself, go to http://localhost:8678/wsrpdev/wsdl/1.0/WSRPService.wsdl, and make sure that the URLs in the XML returned include the wsrpdev changes you made in step 6.
    2. Bring up a browser on the host itself and see if the default sample comes up: http://localhost:8678/wsrpdev/portlets/ASPNET_AJAX_sample/default.aspx
    3. Register the producer in WLP and test the portlet.

    Changing the Port Used by WSRP Producer
    The pre-configured port for the WSRP Producer is 8678. You can change this port by updating both the IIS configuration and C:\Oracle\Middleware\WSRPProducerForDotNet\[WSRP_APP_NAME]\wsdl\1.0\WSRPService.wsdl.

    Do You Need to Migrate?
    Oracle Premier Support ended in November of 2010 for AquaLogic Interaction .NET Application Accelerator 1.x, and Extended Support ends in November 2012 (see http://www.oracle.com/us/support/lifetime-support/lifetime-support-software-342730.html for other related dates). This means that integration with products released after November of 2010 is not supported. If having such support is the policy within your enterprise, you do indeed need to migrate. If changes in your enterprise cause your current solution with the .NET Accelerator to no longer function properly, you may need to migrate. Migration is a choice, and if the goals of your enterprise are to take full advantage of newer technologies, then migration is certainly one activity you should be planning for.

    Read the article

  • NVIDIA graphics driver on MacBook Pro 10,1

    - by Boatzart
    I just installed 14.04 over my old 12.04 partition on my MacBook Pro 10,1 (which dual-boots with OS X) by following the instructions here. The only difference is that I'm using rEFInd instead of rEFIt. The proprietary NVIDIA drivers worked great with 12.04, but now I'm unable to boot into Unity with them in 14.04. Generally, I just get a black screen after the GRUB menu, though occasionally I get some kind of panic screen like this, where I see errors like:

        [drm: __gen6_gt_force_wake_mt_get] *ERROR* Timed out waiting for forcewake old ack to clear.
        [drm: __gen6_gt_wait_for_thread_c0] *ERROR* GT thread status wait timed out
        [drm: __intel_ring_setup_status_page] *ERROR* render ring: wait for SyncFlush to complete for TLB invalidation timed out

    etc. Using the nouveau drivers works fine, but everything feels sluggish, so I would really like to get the NVIDIA drivers working. Has anyone successfully gotten the NVIDIA drivers working with the GT 650M Mac Edition?

    Read the article

  • Bad performance with ATI Radeon X1300?

    - by stighy
    Hi, I'm having problems with Ubuntu 10.04 and my ATI Radeon X1300. In particular, I can't enable desktop effects (Compiz) because they are SLOW, and, for example, the same game (Hedgewars) on the same PC runs very slowly on Linux but not on Windows. With my old Ubuntu (9.04) I didn't have this problem. Can anyone help me "configure" the right driver for my video card? I've tested both the proprietary driver (fglrx) and the open one (xorg ati/radeon)... both give me problems :(! Thank you!

    Read the article

  • No alternative drivers appearing in Software Sources, and manual install leads to no Unity

    - by Gausie
    I just got a new laptop, installed Ubuntu 12.10, and am trying to install the proprietary NVIDIA drivers. Once I understood the change from jockey, I did a fresh install and followed these instructions: http://techhamlet.com/2012/11/install-nvidia-drivers-in-ubuntu-12-10/ But when I do, Unity crashes on startup. My hardware according to lspci | grep VGA is as follows:

        00:02.0 VGA compatible controller: Intel Corporation 3rd Gen Core processor Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: NVIDIA Corporation GK107 [GeForce GT 650M] (rev a1)

    I've followed a couple of NVIDIA 12.10 tutorials found via Google, but none have helped. Can I get any specific advice?

    Read the article

  • Unique Business Value vs. Unique IT

    - by barry.perkins
    When the age of computing started, technology was new, exciting, full of potential, and had a long way to grow. Vendor architectures were proprietary and limited in function at first, growing in capability and complexity over time. There were few if any "standards", let alone "open standards", and the concepts of "open systems" and "open architectures" were far in the future. Companies employed intelligent, talented and creative people to implement the best possible solutions for their company. At first, those solutions were "unique" to each company.

    As time progressed, standards emerged, companies shared knowledge, the business capability supplied by technology grew, and companies continued to expand their use of technology. Taking advantage of change required companies to endure periodic "revolutionary" change cycles: costly changes that were fraught with risk, resulted in solutions with an increasingly shorter half-life, and frequently required altering existing business processes and retraining employees and partner businesses. The pace of technological invention and implementation grew at an ever-increasing rate, making the "revolutionary" approach based upon "proprietary" or "closed" architectures or technologies no longer viable. Concurrent with the advancement of technology, the rate of change in business increased, leading us to the incredibly fast-paced, highly charged, and competitive global economy that we have today, where the most successful companies are those that are good at implementing, leveraging and exploiting change.

    Fast forward to today, a world where dramatic changes in business and technology happen continually, a world where "evolutionary" change is crucial. Companies can no longer afford to build "unique IT", nor can they afford regular intervals of "revolutionary" change, with the associated costs and risks. Human ingenuity was once again up to the task, turning technology into a platform supporting business through evolutionary change, by employing "open": open standards, open systems, open architectures, and open solutions. Employing "open" enables companies to implement systems based upon technology, capability and standards that will evolve over time, providing a solid platform upon which a company can drive business needs, requirements, functions, and processes down into the technology, rather than exposing technology to the business. This allows companies to focus on providing "unique business value" rather than "unique IT".

    The big question! Does moving from "older" technology that no longer meets the needs of today's business to new "open" technology require yet another "revolutionary" change? A "revolutionary" change with a short half-life, camouflaging reality with great marketing? The answer is "perhaps". With the endless options available to choose from, it is entirely possible to implement a solution that may work well today but in 5 years' time will become yet another albatross for the company to bear. Some solutions may look good today, solving a budget challenge by reducing cost, or solving a specific tactical challenge, but result in highly complex environments that may be difficult to manage and maintain and limit the future potential of your business. Put differently, some solutions might push today's challenge into the future, resulting in a more complex and expensive solution. There is no such thing as a "one size fits all" IT solution for business.
    If all companies implemented business solutions based upon technology that required, or forced, the same business processes across all businesses in an industry, it would be extremely difficult to show competitive advantage through "unique business value". It would be equally difficult to "evolve" to meet or exceed business needs and keep up with today's rapid pace of change. How does one ensure that they do not jump from one trap directly into another? Or, to put it positively, there are solutions available today that can address these challenges and issues. How does one ensure that the buying decision of today will serve the business well for years into the future?

    Intelligent and Informed Decisions - "Buying Right"
    In a previous blog entry, we discussed the value of linking tactical to strategic. The key is driving the focus to what is best for your business: handling today's tactical issues while also aligning with a roadmap/strategy that is tightly aligned with your strategic business objectives. When considering the plethora of possible options that provide various approaches to solving today's complex business problems, it is extremely important to ensure that the vendors supplying those options focus on what is best for your business: supplying sufficient information, providing adequate answers to questions, addressing challenges, issues, concerns and objections honestly and openly, and supplying solutions that are tailored for, and deliver the most business value possible for, your business.

    Here are a few questions to consider relative to the proposed options that should help ensure that today's solution doesn't become tomorrow's problem. Do the proposed solutions:

    - Solve the problem(s) you are trying to address?
    - Provide a solid foundation upon which to grow/enhance your business?
    - Provide tactical gains that align with and enable your strategic business goals/objectives?
    - Provide an infrastructure that can be leveraged with subsequent projects?
    - Solve problems for the business overall and the lines of business, or just IT?
    - Simplify your current environment?
    - Provide the basis for business efficiency, agility, and clarity; governance, risk, and compliance; and real-time business visibility and trend analysis?

    And finally: does your IT staff have the knowledge and experience to successfully manage the proposed systems once they are deployed in production?

    Done well, you will be presented with options tailored to your business that enable you to drive the "unique business value" necessary to help your business stand out from others, creating a distinct competitive advantage and delivering what your customers need, when they need it, so you can attract new customers and new business and grow top-line revenue, all at a cost that provides a strong Return on Investment/Return on Assets. The net result is growth with managed cost, providing significantly improved profit margin and shareholder value.

    Read the article

  • What is your opinion on free software? [closed]

    - by Joe D
    I use mostly free software on my main machine, and most programs that I write are released under either the GPL or BSD licenses. I dislike proprietary software and prefer not to use it if a free software alternative is viable (read as: good enough). What are your opinions on free software? Do you use it or develop for it? Why do you use it? Why don't you? Are you as much of an extremist as RMS, using only free software?

    Read the article

  • Screen brightness control not working on Lenovo T530

    - by Matt
    My brightness control doesn't work with a fresh install of 12.10 (brand new laptop). It is set to the brightest setting when I boot up, and when I try to change it, I see the notification bar come up but the brightness doesn't actually change. I've tried all the solutions I could find around the Internet but none of them work. Things I have tried include:

    - Editing /sys/class/backlight/acpi_video0/brightness
    - In /usr/share/X11/xorg.conf.d/10-brightness-control.conf: Option "RegistryDwords" "EnableBrightnessControl=1"
    - In /etc/default/grub: GRUB_CMDLINE_LINUX_DEFAULT="quiet splash acpi_osi=Linux acpi_backlight=vendor"

    There is no xorg.conf file in 12.10 that I have found, so the solutions that suggest editing that file don't do me a whole lot of good. I am currently using the Nouveau driver, but switching to the NVIDIA proprietary drivers made no difference. Any other ideas? When is this bug going to be fixed? With all the reports I've come across, I would think it would get a lot of attention. Thanks.

    Read the article

  • How to sync tasks between Ubuntu and Android?

    - by andrewsomething
    I was using first Tasque and then GTG on Ubuntu, and Astrid on Android, with Remember The Milk as the backend. Recently, Astrid dropped support for Remember The Milk, and both Tasque's and GTG's support for syncing with RTM has always left something to be desired. So I'm looking for a new solution, and I'm open to leaving RTM, especially if it is for something that isn't proprietary. What have you found to keep your task lists in sync between Ubuntu and Android? In particular, I'm looking for a desktop-based program on the Ubuntu side rather than something in the browser. Astrid has treated me well, but I wouldn't mind trying something else. Currently, it can sync with Astrid.com, Google Tasks, and Producteev. Does anyone know of a desktop app that supports any of those?

    Read the article

  • Login screen restarts while entering password

    - by Shane L
    I am having a problem that only occurred after installing the fglrx proprietary driver through the Additional Drivers app. The exact same issue is described in this question, but it's closed: Must login twice before entering Unity; first login screen has graphical anomalies. When I boot up my computer and get to the LightDM login screen, I start typing my password; upon entering the first two characters ("n2"), the screen starts displaying some corruption along its very top edge. After I hit the number 2, LightDM seems to crash and restart, and from that point everything is fine and I can log in. So the issue is: every time I reboot, I have to log in once, allow LightDM to crash and restart, then log in as normal. The issue did not occur prior to installing fglrx, and it has persisted through several reinstalls.

    Read the article

  • dual-boot (win-xp/ubu12.04) graphics card for ubu-desktop/win-xp-games

    - by iole1
    For work I need to get a new and cheap graphics card for a dual-boot machine: Windows XP / Ubuntu 12.04 LTS. The only requirements I have are:

    - It should work 'flawlessly' in Ubuntu (proprietary drivers are OK).
    - It should handle Guild Wars 2 and League of Legends in Windows XP (this is really the top priority, as we need to be able to play at work :) - yes, I have a cool job).

    I know nothing about graphics cards (and it seems to be a jungle out there). From other questions here and some web investigation, I think I'd like to go for an NVIDIA card. I've been trying to figure out which models fit the system requirements, but it seems they use different kinds of model numbers, so I'm none the wiser.

    tl;dr: will http://www.geforce.co.uk/hardware/desktop-gpus/geforce-gt-620-oem/specifications run Guild Wars 2 (http://gamesystemrequirements.com/games.php?id=938)? Or what is the worst card from NVIDIA that will run GW2 smoothly and work well in Ubuntu 12.04? Thanks!

    Read the article

  • Dell Inspiron 1120 Ubuntu Light -> Desktop and now I'm having problems with wifi and suspend

    - by David N. Welton
    I got a Dell Inspiron 1120, which ships with Ubuntu Light as well as Windows. My wife prefers Ubuntu, but obviously outside of web stuff you can't do a lot with Light, so I went ahead and installed the Desktop version of Ubuntu (10.10 / Maverick). Whereas before it suspended beautifully and connected to wifi networks flawlessly, it now displays the following problems:

    - It seems to suspend OK, but on resume the screen remains blank, even though the computer appears to wake up again.
    - Wifi doesn't connect.

    I tried using the suggested proprietary drivers, and those don't seem to change the situation. All in all, it's a bit frustrating to run into these sorts of "regressions" - does anyone know what sort of drivers and configuration Ubuntu Light might have shipped with for this computer that made it work so well? Unfortunately, I wiped the disk in order to install the Desktop version of Ubuntu.

    Read the article

  • How do I get TV out working on an ATI X800 GTO²?

    - by Liran
    Is there any way to make my graphics card (ATI X800) work with TV out? I know there are no proprietary drivers for this card, only the open-source ones. It uses this cable for video out: http://warningdang.dynalias.com/magento/media/catalog/product/cache/1/image/265x265/9df78eab33525d08d6e5fb8d27136e95/a/3/a31b_1.jpg Personally, I use the yellow RCA connector for TV out (in Windows). I hope there is a way to work it out, because it's a really important feature for me and one of the only reasons I boot Windows. Thanks ahead!

    Read the article

  • Corrupt Plymouth boot screen when using proprietary NVIDIA drivers

    - by Tejas Hosangadi
    I recently got the final version of Ubuntu Natty Narwhal x64, and everything is working just fine. The only thing I am having trouble with is the boot screen. After I installed the proprietary NVIDIA drivers, the boot screen became corrupt and only shows text (as it has been doing since Ubuntu 10.04). When I followed the instructions for fixing the boot screen in 10.04 and 10.10, the boot screen still remained the same and the problem wasn't resolved. Please give a full guide on how to fix this problem.

    Machine details:
    - Computer: HP Pavilion p6240in
    - Processor: Core 2 Duo
    - Memory: 4096 megabytes
    - Graphics: NVIDIA G210 3D graphics
    - Monitor: Dell 17-inch LCD

    Another question I would like to ask is whether it is possible to use this script (http://www.webupd8.org/2010/10/script-to-fix-ubuntu-plymouth-for.html) with 11.04. I ask because it says that the script works only with 10.04 and 10.10.

    Read the article

  • Take Steps to Mitigate the Threat of Insiders

    - by Troy Kitch
    Register now for our upcoming Feb 23 webcast, "The Insider Threat: Understand and Mitigate Your Risks." Insiders, by virtue of legitimate access to their organizations' information and IT infrastructure, pose a significant risk to employers. Employees, motivated by financial problems, greed, revenge, the desire to obtain a business advantage, or the wish to impress a new employer, have stolen confidential data, proprietary information, or intellectual property from their employers. Since this data typically resides in databases, organizations need to consider a database security defense-in-depth approach that takes into account preventive and detective controls to protect their data against abuse by insiders. Register now and learn about:

    - Actual cases of insider cyber crimes
    - The three primary types of insider cyber crimes: IT sabotage, theft of intellectual property (e.g. trade secrets), and employee fraud
    - The lack of controls around data that allows these crimes to be successful
    - Solutions to help secure data and database infrastructure

    Read the article

  • SQLAuthority News – Various Microsoft SQL Server Documentations Available for Download

    - by pinaldave
    Microsoft has recently released various SQL Server-related documentation, listed here for quick reference.

    Microsoft SQL Server Protocol Documentation - The Microsoft SQL Server protocol documentation provides technical specifications for Microsoft proprietary protocols that are implemented and used in Microsoft SQL Server 2008.

    SQL Server Data Portability Documentation - The SQL Server data portability documentation explains various mechanisms by which user-created data in SQL Server can be extracted for use in other software products. These mechanisms include import/export functionality, documented APIs, industry-standard formats, and documented data structures/file formats.

    SQL Server Standards Support Documentation - The SQL Server standards support documentation provides detailed support information for certain standards that are implemented in Microsoft SQL Server.

    Microsoft Product Support Reports - Download the scripted system-configuration gathering tools. The Microsoft Product Support Reports utility facilitates the gathering of critical system and logging information used in troubleshooting support issues.

    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Is Linear Tape File System (LTFS) Best For Transportable Storage?

    - by rickramsey
    Those of us in tape storage engineering take a lot of pride in what we do, but we understand that tape is the right answer to a storage problem only some of the time. And, unfortunately for a storage medium with such a long history, it has built up a few preconceived notions that are no longer valid. When I hear customers debate whether to implement tape vs. disk, one of the common strikes against tape is its perceived lack of usability. If you could go back a few generations of corporate acquisitions, you would discover that StorageTek engineers recognized this problem and started developing a solution where a tape drive could look just like a memory stick to a user. The goal was to not have to care about where files were on the cartridge, but to simply see the list of files that were on the tape and click on them to open them up.

    Eventually, our friends in tape over at IBM built upon our work at StorageTek and Sun Microsystems and released the Linear Tape File System (LTFS) feature for the current LTO5 generation of tape drives as an open specification. LTFS is really a wonderful feature, and we're proud to have taken part in its beginnings and, as you'll soon read, its future. Today we offer LTFS-Open Edition, which is free for you to use in your Oracle Enterprise Linux 5.5 environment - not only on your LTO5 drives, but also on your Oracle StorageTek T10000C drives. You can download it free from Oracle and try it out.

    LTFS does exactly what its forefathers imagined. Now you can see immediately which files are on a cartridge. LTFS does this by splitting a cartridge into two partitions. The first holds all of the necessary metadata to create a directory structure for you to easily view the contents of the cartridge. The second partition holds the files themselves. When tape media is loaded onto a drive, a complete file system image is presented to the user. Adding files to a cartridge can be as simple as a drag-and-drop, just as you do today on your laptop when transferring files from your hard drive to a thumb drive, or with standard POSIX file operations.
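    Because a mounted LTFS cartridge presents itself as an ordinary file system, no tape-specific API is needed to write to it. A minimal sketch in C#, assuming a cartridge already mounted at a hypothetical mount point; the path and file names are placeholders:

        using System;
        using System.IO;

        class LtfsCopyExample
        {
            static void Main()
            {
                // Hypothetical LTFS mount point; in practice this is wherever
                // the LTFS software mounted the cartridge.
                const string mountPoint = "/mnt/ltfs";

                // Standard file operations work: copy a file onto the cartridge...
                File.Copy("seismic-survey-001.dat",
                          Path.Combine(mountPoint, "seismic-survey-001.dat"));

                // ...and list what the cartridge holds, served from the
                // cartridge's metadata partition.
                foreach (string file in Directory.GetFiles(mountPoint))
                    Console.WriteLine(file);
            }
        }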
    You may be thinking all of this sounds nice, but asking, "when will I actually use it?" As I mentioned at the beginning, tape is not the right solution all of the time. However, if you ever need to physically move data between locations, tape storage with LTFS should be your most cost-effective and reliable answer. I will give a few examples of use cases where LTFS can be utilized.

    Media and Entertainment (M&E), Oil and Gas (O&G), and other industries have a strong need for their storage to be transportable. For example, an O&G company hunting for new oil deposits in remote locations takes very large underground seismic images which need to be shipped back to a central data center. M&E operations conduct similar activities when shooting video for productions. M&E companies also often transfer files to third parties for editing and other activities. These companies have three highly flawed options for transporting data: electronic transfer, disk storage transport, or tape storage transport. The first option, electronic transfer, is impractical because of the expense of the bandwidth required to transfer multi-terabyte files reliably and efficiently. If there's one place that has bandwidth, it's your local post office, so many companies revert to physically shipping storage media. Typically, M&E companies rely on transporting disk storage between sites, even though it, too, is expensive.

    Tape storage should be the preferred format because, as IDC points out, "Tape is more suitable for physical transportation of large amounts of data as it is less vulnerable to mechanical damage during transportation compared with disk" (see Note 1, below). However, tape storage has not been used in the past because of the restrictions created by proprietary formats. A tape may only be readable if both the sender and receiver have the same proprietary application used to write the file. In addition, workflows may be slowed by the need to read the entire tape cartridge during recall. LTFS solves both of these problems, clearing the way for tape to become the standard platform for transferring large files. LTFS is open and, as long as you've downloaded the free reader from our website or that of anyone in the LTO consortium, you can read the data. So if a movie studio ships a scene to a third-party partner to add, for example, sound effects or a music score, it doesn't have to care what technology the third party has. If it's written back to an LTFS-formatted tape cartridge, it can be read.

    Some tape vendors like to claim LTFS is a "standard", but beauty is in the eye of the beholder. It's a specification at this point, not a standard. That said, we're already seeing application vendors create functionality to write in an LTFS format based on the specification. And it's my belief that both customers and the tape storage industry will see the most benefit if we all follow the same path. As such, we have volunteered to lead the way in making LTFS a standard, first with the Storage Networking Industry Association (SNIA), and eventually through to standards bodies such as the American National Standards Institute (ANSI). Expect to hear good news soon about our efforts.

    So, if storage transportability is one of your requirements, I recommend giving LTFS a look. It makes tape much more user-friendly, and it's free, which allows tape to maintain all of its cost advantages over disk!

    Note 1 - IDC Report, April 2011: "IDC's Archival Storage Solutions Taxonomy, 2011"

    - Brian Zents

    Read the article

  • Unity very slow while Gnome Classic running just fine

    - by Sorin Sbarnea
    I see tons of people complaining about Unity's speed, and I think the problem is not with the video drivers. When I log in to GNOME Classic the system behaves just fine, but in Unity I can barely use it: windows are hard to move and the terminal is painfully slow. Is there any solution or bug that I should track?

    Details:
    - Ubuntu 11.10
    - Two-monitor setup
    - Latest NVIDIA proprietary drivers (tested with the default ones also; no change)
    - 6 GB RAM, Xeon @ 2.8
    - NVIDIA driver 280.13 - Quadro NVS 295 with 8 cores, 256 MB RAM

        lspci | grep VGA
        02:00.0 VGA compatible controller: nVidia Corporation G98 [Quadro NVS 295] (rev a1)

        uname -a
        Linux sorins 3.0.0-16-generic #29-Ubuntu SMP Tue Feb 14 12:48:51 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux

    Read the article

  • Bug in Firefox address bar autocomplete when running on KDE

    - by marcus
    Has anyone experienced this graphical glitch when typing in the Firefox address bar? The drop-down list is not drawn correctly, with some "blocks" missing. After typing more letters or hovering the mouse cursor, the list redraws itself and becomes complete. I'm running Ubuntu 12.04 and Firefox 13.0.1, and this only happens in KDE (tested with 4.8.2, 4.8.3 and 4.8.4). It does not happen in Unity or Xfce with the same user profile. If I go to the KDE control panel and disable the Fade effect, the bug starts to happen in almost every menu in the system, including the taskbar window previews. Enabling the "Fade" effect corrects the bug everywhere except in Firefox. I have an NVIDIA card and I am using the proprietary driver (current, not current-updates -- not sure about the difference), but the linked question on an Arch Linux forum says this happens with the open-source driver and with other cards too. Does anyone have an idea for a solution?

    Read the article

  • Are there any plans to create a standard based on libunity?

    - by Sebastian Billaudelle
    In the last few months, and maybe even years, Ubuntu and Canonical have often been criticised for developing software and desktop components without talking to other groups in the free software community. I don't want to comment on this topic, but I do see problems arising from creating a "proprietary" solution for displaying indicators and progress bars with a launcher like Unity. In the world of free and open desktop environments we often try to standardize parts and libraries, or write specifications, to increase collaboration between different desktops. We have the instrument of http://freedesktop.org, and a lot of specifications are getting implemented by the major desktop environments. In this context, proposing a standard for those indicators would be a great step towards better interoperability between desktops. These indicators represent a great feature of the Linux desktop, and I'm sure that other projects like AWN, Docky, etc. would pick them up. With Ubuntu's great market share, Canonical is in a position to propose this as a standard and encourage projects to implement it. Thank you in advance, Sebastian Billaudelle

    Read the article

  • GPL: does one line of GPLed code make a program a "derived work"?

    - by SigTerm
    I've recently run into an argument with a person who claims to be a lawyer (I have my suspicions about this not being completely true, though). As far as I know, copying even one line of code from a GPLed program into a proprietary body of code requires you to release the whole thing under the GPL if you ever decide to publish the software and make it available to the public. The person in question claims that this is "absurd" (I know it is, but AFAIK that's how the GPL works), that it is "redefining copyright", that the "GPL has no power to do that", and that claiming "one line of GPLed code makes you release the whole thing under GPL" is absurd. That contradicts the GPL FAQ. Can somebody clarify the situation? Am I right in assuming that copying even the smallest subroutine from a GPL program into your code automatically makes your program a "derived work", which means you are obliged to release it under the GPL license if you publish it?

    Read the article

  • Cannot control the fan on a Sony Vaio laptop

    - by Xobb
    I've got a Sony Vaio laptop and my fan is on all the time, though the temperature of the video adapter is still always over 60°C. As I've found by googling, vaiofand does not support the VPC-EA series. Is there anything I can do about this, or do I need another laptop? I'm using the following graphics card:

        01:00.0 VGA compatible controller: ATI Technologies Inc Manhattan [Mobility Radeon HD 5000 Series]

    though the problem is not with the graphics card; I mentioned its temperature because it is the highest (64°C now, by the way). It seems like the notebook is always overheated, and I cannot control the fan speed to make it cooler. And yes, I'm using the proprietary ATI/AMD FGLRX graphics driver.

    Read the article

  • Methods to Manage/Document "one-off" Reports

    - by Jason Holland
    I'm a programmer who also does database work, and I get a lot of so-called one-time report requests as well as recurring report requests. I work at a company that has a SQL Server database into which we integrate third-party data. We also have some third-party vendors whose proprietary reporting systems we have to use to extract data in flat-file format; for security reasons, that data is not integrated into SQL Server. To generate many of these reports I have to query data from various systems, write small scripts to combine data from the separate systems, cry, pull my hair, curse the name of the last guy who made the report before me, etc. My question is: what are some good methods for documenting the steps taken to generate these reports, so the next poor soul who has to do them won't curse my name? As of now I just have a folder with subfolders per project containing the selects and scripts that generated the last report, but that seems like a "poor man's" solution. :)

    Read the article

  • Coded UI to measure performance

    - by Mike Weber
    I have been tasked with using Coded UI to measure performance of a proprietary Windows desktop application. The need is to measure how long it takes for the next page/screen to display after a user clicks on a control. For example: a user enters their ID and password and clicks sign-in; the need is to measure how long it takes for the next screen to display after the user clicks the sign-in button. I understand the need to define what indicates the screen is loaded and ready for use. One approach is to use control.WaitForControlReady and BeginTimer/EndTimer. Is Coded UI a dependable and accurate way of measuring time? Is WaitForControlReady the best method to determine when a control is ready for use?
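    For reference, a minimal sketch of the approach described in the question, assuming a Coded UI test project where signInButton and homeScreen are hypothetical, already-mapped UITestControls; a Stopwatch stands in here for the load-test BeginTimer/EndTimer pair:

        using System.Diagnostics;
        using Microsoft.VisualStudio.TestTools.UITesting;

        public static class SignInTimer
        {
            // Returns the elapsed milliseconds between clicking sign-in and
            // the next screen's control reporting that it is ready for use.
            public static long TimeSignIn(UITestControl signInButton, UITestControl homeScreen)
            {
                var watch = Stopwatch.StartNew();
                Mouse.Click(signInButton);
                homeScreen.WaitForControlReady(); // blocks until the control is ready
                watch.Stop();
                return watch.ElapsedMilliseconds;
            }
        }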

    Read the article
