Search Results

Search found 23021 results on 921 pages for 'process monitoring'.


  • remote desktop: the app on the remote machine is confusing the controller's hostname with its own hostname

    - by David Dai
    I have two machines, A and B, both running Windows. A is my work machine; B is a server on which I have already installed SQL Server. Now I want to install another piece of software on B that runs on top of SQL Server. I remote-connect to B from A, then start the installer on the remote desktop. During the installation there is a step where I can configure which server to connect to, and normally B's hostname is entered automatically in the hostname field. The issue I'm having is that when I get to that step, A's hostname is entered automatically instead of B's, and even if I manually correct it to 'localhost', '127.0.0.1', or B's hostname, the installer still cannot connect to B's service, as if it were still trying to connect to A. How does this happen? How is this even possible?
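
    A quick sanity check is to compare the hostname Windows reports on B with the server name the local SQL Server instance believes it has; if those two agree but the installer shows something else, that points at where the wrong name is coming from. A minimal sketch, run in a command prompt on B (the default-instance connection is an assumption; adjust for a named instance, or use osql on older SQL Server versions):

        hostname
        sqlcmd -S localhost -E -Q "SELECT @@SERVERNAME"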

    Read the article

  • 2013 U.S. GAAP Financial Reporting Taxonomy Available for Public Review and Comment

    - by Theresa Hickman
    FASB recently released the proposed 2013 U.S. GAAP Reporting Taxonomy. Comments are due October 29, 2012, and the taxonomy is to be finalized and published in early 2013. The proposed 2013 U.S. GAAP taxonomy and instructions on how to submit comments are available at the FASB’s XBRL page. In previous blog entries, I talked about how Oracle Hyperion Disclosure Management supports the latest taxonomy, enabling financial managers to easily comply with the latest filing requirements. The taxonomy is a list of computer-readable tags in XBRL that allows companies to annotate the voluminous financial data that is included in typical long-form financial statements and related footnote disclosures. The tags allow computers to automatically search for, assemble, and process data so it can be readily accessed and analyzed by investors, analysts, journalists, and regulators. You do not have to have Oracle Hyperion Financial Management, used for consolidating financial results, to generate XBRL. You just need Oracle Hyperion Disclosure Management to generate XBRL instance documents from financial applications, such as Oracle E-Business Suite, Oracle PeopleSoft, Oracle JD Edwards EnterpriseOne, and Oracle Fusion General Ledger. To generate XBRL tags and complete SEC filings using your existing financial applications with Oracle Hyperion Disclosure Management, here are the steps:
      1. Download the XBRL taxonomy from the SEC or XBRL website into Hyperion Disclosure Management to create a company taxonomy.
      2. Publish financial statements from the general ledger to Microsoft Excel or Microsoft Word.
      3. Create the SEC filing in the Microsoft programs and perform the XBRL tag mapping in Oracle Hyperion Disclosure Management.
      4. Ensure that the SEC filing meets XBRL and SEC EDGAR Filer Manual validation requirements.
      5. Validate and submit the company taxonomy and XBRL instance document to the SEC.
    Get more details about Oracle Hyperion Disclosure Management.

    Read the article

  • WolframAlpha Can Now Do In-depth Analysis of Your Facebook Account

    - by Jason Fitzpatrick
    If you’re a big fan of WolframAlpha’s ability to crunch the numbers on just about anything–and we certainly are–you’ll likely be just as delighted as we were to watch it massage the data from your Facebook account. Find out your most liked, discussed, and shared posts, see your Facebook habits, and other neat trends. I unleashed it on my account this morning, not sure what to expect from the results. Within the results tabulation, WolframAlpha provided me with all sorts of neat data breakdowns. I now know exactly how many days it is to my next birthday, the composition of my aggregate posting habits (how many posts are status updates, links, or photos), the time of day when I do the most posting (and what the composition of those posts is), and my average post length. I also know my most liked post and my most commented-on post. It will even crunch the numbers on your network of friends (60.6% of my friends are married, for example). By far one of the more interesting analyses it performs on the friendship data, however, is organizing all your friends into relationship clusters so you can see who in your Facebook network is friends with other people in your Facebook network. The service from WolframAlpha is free: simply visit the WolframAlpha search portal and type in “Facebook report” to start the process. You’ll be prompted to create a WolframAlpha account if you don’t have one and to authorize the WolframAlpha Facebook app to access your data. Your Facebook data is cached to your WolframAlpha account for one hour in order to crunch the numbers and display the results.

    Read the article

  • How to Kill an Alternate X Session via CLI

    - by L. D. James
    Can someone tell me how to remove dormant X sessions? This question is similar to Logging out other users from the command line, but more specific to controlling X displays, which I find hard to kill. I used the command who -u to list the sessions on the other screens:

        $ who -u
        user1  :0      2014-08-18 12:08   ?     2891 (:0)
        user1  pts/26  2014-08-18 16:11 17:18   3984 (:0)
        user2  :1      2014-08-18 18:21   ?    25745 (:1)
        user1  pts/27  2014-08-18 23:10 00:27   3984 (:0)
        user1  pts/32  2014-08-18 23:10 10:42   3984 (:0)
        user1  pts/46  2014-08-18 23:14 00:04   3984 (:0)
        user1  pts/48  2014-08-19 04:10   .     3984 (:0)

    The kill -9 25745 doesn't appear to do anything. I have a workshop where a number of users will use the computer under their own login. After the workshop is over there are a number of logins that are left open. I would prefer to kill the open sessions rather than try to log into each user's screen. Again, this question isn't just about logging users out; I'm hoping to get clarity on killing/removing stuck processes that are hard to kill.

    New info: while still pondering how to kill the process I wrote the following script, which did it:

        #!/bin/bash
        results=1
        while [[ $results > 0 ]]
        do
            # retry every 20 seconds until kill reports success
            sudo kill -9 25745
            results=$?
            echo -ne "Response:$results..."
            sleep 20
        done

    After a graceful waiting period, if there isn't a better answer I'll mark this as answered with this resolution. This may resolve the problem I have had in the past with other stuck processes.
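
    For reference, a blunter way to clear out another user's leftover session is to kill every process owned by that login at once; a minimal sketch (user2 stands in for whichever account is stuck):

        # sends SIGKILL to all of user2's processes, including the X session on :1
        sudo pkill -KILL -u user2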

    Read the article

  • Can you set CIFS permissions from EMC Command Line?

    - by TJ.
    I am in the process of migrating file shares from my EMC NS-20 to my new VNXe 3100. I am using a RoboCopy script to move the files but am getting errors on some files and folders. I have Domain Admin privileges but when I go to view the security permissions on the folders it says I don't have permissions. I have tried taking ownership to get around the permissions issue but that fails too. So as a last resort can I set permissions on this folder from the EMC console or Web management console?
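
    As a side note on the copy step, Robocopy's backup mode will often get past ACLs that block a normal copy, as long as the account holds the backup privilege on both boxes; a minimal sketch with placeholder share names:

        robocopy \\ns20\share \\vnxe3100\share /MIR /COPYALL /B /R:1 /W:1 /LOG:C:\migration.log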

    Read the article

  • Opportunities in Development in our Swedish office

    - by anca.rosu
    Hi everyone, my name is Henrik and I joined the JRockit group in 2004. Before that my background was Microsoft, as both a Test Competence lead and a Program Manager. As an Engineering Manager at Oracle I lead a team of 11 developers. I focus on people management and the daily operations of the department, with a heavy focus on interaction and dependencies between the groups and departments here at the Stockholm development site. I also make sure my team delivers on our commitments. I would like to give you a brief summary of the Oracle JRockit team:
      - The development group in Stockholm delivers several products for the Oracle Fusion Middleware stack. Our main products are JRockitVE, which allows you to run a Java Virtual Machine without an operating system; the JRockit Java Virtual Machine, which is the default JVM for all Oracle middleware products; and JRockit Mission Control, a set of tools that allows developers to monitor their applications at runtime and perform advanced latency analysis as well as in-production memory leak detection.
      - The office has several departments focusing on different aspects of the product development process: not only building features and testing them, but everything from building the infrastructure needed to automatically build and test the products, to sustaining engineering that tracks down bugs in customer systems and provides them with patches.
    Some inspirational lines around what the Oracle JRockit group can offer you in terms of progress, development and learning: it is a unique chance to get insight and experience building enterprise-class software for one of the world's largest software companies. There are almost unlimited possibilities for the right candidate to learn about silicon features, how to implement support for them in software, and compiler optimizations. The position will also give insight into the processes needed to produce software at this level in the industry. If you have any questions related to this article feel free to contact [email protected]. You can find our job opportunities via http://campus.oracle.com. Technorati Tags: Development, Sweden, JRockit, Java, Virtual Machine, Oracle Fusion Middleware, software

    Read the article

  • Siebel BIP Integration

    - by Tim Dexter
    This post is more of a bookmark for me so that I stop bugging the brown stuff out of John, the Siebel-BIP product manager. I have had multiple customers over the past two weeks asking for help around the integration. What's it capable of? How can I allow my users to click a button to run a BIP report? How can I kick off a report from a Siebel workflow? Start right here: this is a great white paper explaining what's now available with the integration using the Siebel Report Business Service. Once you have consumed that from start to finish, get on over to Oracle Support and look for the following note, which has code samples and lots of other good stuff: Siebel BI Publisher Reports Business Service (8.1.1.7+) [ID 1425724.1]. The Reports Business Service enables BI Publisher reports to be executed from the Siebel application via a Workflow Process, or through scripting. The report is generated in the background by connecting to the BI Publisher server. The report output is stored in the Siebel File System and accessed from the My BI Publisher Reports view. Alternatively, using appropriate methods, the report can be attached to an entity or sent to a particular delivery channel.

    Read the article

  • Quick Script for Adding Skype Groups

    - by Robert May
    So, I needed to add about 30 people to several different Skype groups today, and I didn’t want to repeat the /add [skypename] thing over and over and over.  Building the list was a pain . . . I couldn’t find a good way to extract all of the users in an existing group.  There’s probably an API or something, but I just did that part by hand. Adding them to the groups was pretty easy with Windows Scripting Host.  Basically, I just ran this:

        <package>
           <job id="vbs">
              <script language="VBScript">
                 set WshShell = WScript.CreateObject("WScript.Shell")
                 WshShell.AppActivate 4484
                 WScript.Sleep 100
                 WshShell.SendKeys "/add user1~"
                 WScript.Sleep 100
                 …
                 WshShell.SendKeys "/add usern~"
                 WScript.Sleep 100
              </script>
           </job>
        </package>

    Add as many users as you need by copying the SendKeys and Sleep lines.  Then, save the script to a .wsf file.  The AppActivate line needs to be changed to have the process ID of Skype instead of the number there.  To get that, open up Task Manager, click on Processes, then find skype.exe and note its PID. Before you double-click on the file in Windows Explorer, you’ll need to have created the groups in Skype.  For each group, open the group and click in the chat window of the group.  Then double-click on the WSF file.  If you don’t click in the chat window, you will likely get the add-user dialog box instead of just adding the users. Technorati Tags: Skype, Script
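
    If you would rather grab Skype's PID from a command prompt than from Task Manager, something like this should do it (a minimal sketch):

        tasklist /FI "IMAGENAME eq skype.exe"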

    Read the article

  • OS X will not register newly installed network adapters

    - by Chris
    I have purchased an Edimax 7318USg and tested it on a Windows machine (works). The installation process for the software for this adapter runs smoothly. However, OS X simply does not recognize new network adapters. When you go to System Preferences/Network, a new network adapter should be present/there should be an alert. This is not the case. Why might this be? Is there a setting I may reset to force the operating system to recognize this? Thanks!
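
    For what it's worth, two quick Terminal checks will show whether OS X loaded the vendor's driver and created an interface at all; a minimal sketch (the driver name to grep for is a guess, since it depends on the chipset Edimax ships):

        networksetup -listallhardwareports
        kextstat | grep -i -e ralink -e edimax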

    Read the article

  • What kind of code would Kent Beck avoid unit testing?

    - by tieTYT
    I've been watching a few of the Is TDD Dead? talks on YouTube, and one of the things that surprised me is that Kent Beck seems to acknowledge that there are just some kinds of programs that aren't worth unit testing. For example, right here DHH says that Kent Beck is ... very happy to say "Well, TDD doesn't fit in this case, I'm just going to bail." It's frustrating to me that Kent Beck seems to acknowledge this, but nobody asks him to elaborate on it or give concrete examples. I'd like to know the situations where Kent Beck thinks TDD is a bad fit. Nobody can read his mind or speak for him, but I'm hoping he's been transparent enough through his books/tweets/whatever for someone to be able to answer. I'm not necessarily going to take what he says as gospel, but it would be useful to know that the times I've tried TDD and it just felt impossible/useless are situations that he would have bailed on himself. Or, if it turned out he would have tested that code, it would suggest to me that I was approaching the process very wrong. I also think it would be enlightening to understand why he would bail on such projects. My opinion on why this is not a duplicate of "When is it appropriate to not unit test?": after skimming those answers, I'm not satisfied. For example, look at UncleBob's answer. He doesn't even acknowledge that such a situation exists. I really think there's value in understanding Kent Beck's position, not just a general "What's your opinion?" type of question. After all, he's the father of TDD.

    Read the article

  • Problem upgrading kernel on debian 3.1

    - by exhuma
    Hi, I have a quite old box in a remote server farm, so I have no direct access, only remote SSH (and, via SSH, a serial console). I haven't updated this box in ages. Now, whenever I want to install a new package, a dependency on glibc appears. Unfortunately, the install of glibc depends on a 2.6 kernel and I am running a venerable 2.4 kernel (one more reason to upgrade). The problem is that the install of a new kernel has an indirect (via locales) dependency on glibc. So, to install glibc, I need a new kernel, and for a new kernel, I need to upgrade glibc. Essentially I am blocked. What's the best way to proceed considering I have no "hardware" access? Here's a quick transcript of the upgrade process:

        [green:~]% sudo aptitude install linux-image-686
        Reading Package Lists... Done
        Building Dependency Tree
        Reading extended state information
        Initializing package states... Done
        Reading task descriptions... Done
        The following packages are unused and will be REMOVED:
          gcc-4.3-base
        The following NEW packages will be automatically installed:
          dash libc6-i686 libparse-recdescent-perl linux-image-2.6-686
          linux-image-2.6.18-6-686 module-init-tools yaird
        The following packages have been kept back:
          adduser apache2 apache2-mpm-prefork apache2-utils apache2.2-common apt
          apt-utils aptitude autoconf autotools-dev awstats base-files base-passwd
          [...snip...]
          util-linux vacation vim vim-common wamerican wbritish wget whiptail
          whois wwwconfig-common zlib1g
        The following NEW packages will be installed:
          dash libc6-i686 libparse-recdescent-perl linux-image-2.6-686
          linux-image-2.6.18-6-686 linux-image-686 module-init-tools yaird
        The following packages will be upgraded:
          hotplug libc6
        2 packages upgraded, 8 newly installed, 1 to remove and 277 not upgraded.
        Need to get 0B/22.7MB of archives. After unpacking 52.1MB will be used.
        Do you want to continue? [Y/n/?]
        Writing extended state information... Done
        Preconfiguring packages ...
        (Reading database ... 34065 files and directories currently installed.)
        Preparing to replace libc6 2.3.6.ds1-13 (using .../libc6_2.7-18lenny2_i386.deb) ...
        Checking for services that may need to be restarted...
        Checking init scripts...
        WARNING: init script for postgresql not found.
        [ --- libc6 config screen appears here --- ]
        WARNING: POSIX threads library NPTL requires kernel version 2.6.8 or later.
        If you use a kernel 2.4, please upgrade it before installing glibc.
        The installation of a 2.6 kernel _could_ ask you to install a new libc first,
        this is NOT a bug, and should *NOT* be reported. In that case, please add etch
        sources to your /etc/apt/sources.list and run:
          apt-get install -t etch linux-image-2.6
        Then reboot into this new kernel, and proceed with your upgrade
        dpkg: error processing /var/cache/apt/archives/libc6_2.7-18lenny2_i386.deb (--unpack):
         subprocess pre-installation script returned error exit status 1
        Errors were encountered while processing:
         /var/cache/apt/archives/libc6_2.7-18lenny2_i386.deb
        E: Sub-process /usr/bin/dpkg returned an error code (1)
        Ack! Something bad happened while installing packages. Trying to recover:
        dpkg: dependency problems prevent configuration of locales:
         locales depends on glibc-2.7-1; however:
          Package glibc-2.7-1 is not installed.
        dpkg: error processing locales (--configure):
         dependency problems - leaving unconfigured
        Errors were encountered while processing:
         locales
        Reading Package Lists... Done
        Building Dependency Tree
        Reading extended state information
        Initializing package states... Done
        Reading task descriptions... Done

    Now, if I follow the instructions as prompted I get the following. Note that I am using aptitude instead of apt-get to benefit from the better dependency tracking. I did try with apt-get first, but that led me to the same problem.

        [green:~]% sudo aptitude install -t etch linux-image-2.6.26-2-686
        Reading Package Lists... Done
        Building Dependency Tree
        Reading extended state information
        Initializing package states... Done
        Reading task descriptions... Done
        E: Unable to correct problems, you have held broken packages.
        E: Unable to correct dependencies, some packages cannot be installed
        E: Unable to resolve some dependencies!
        Some packages had unmet dependencies. This may mean that you have requested
        an impossible situation or if you are using the unstable distribution that
        some required packages have not yet been created or been moved out of Incoming.
        The following packages have unmet dependencies:
          linux-image-2.6.26-2-686: Depends: initramfs-tools (>= 0.55) but it is not installable or
                                             yaird (>= 0.0.13) but it is not installable or
                                             linux-initramfs-tool which is a virtual package.

    Any ideas?
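
    For anyone hitting the same chicken-and-egg: the usual way out is to pull the 2.6 kernel and an initramfs generator from the archived etch repository before letting libc6 upgrade. A minimal sketch, with the archive mirror URL as an assumption and obviously untested on this particular box:

        # etch packages now live on archive.debian.org
        echo "deb http://archive.debian.org/debian/ etch main" >> /etc/apt/sources.list
        apt-get update
        # install the initramfs generator and the etch 2.6 kernel first
        apt-get install -t etch initramfs-tools linux-image-2.6-686
        # reboot into the new kernel, then resume the libc6/lenny upgrade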

    Read the article

  • VLC - How to restore GUI?

    - by rism
    I was mucking around with VLC setting up an HTTP interface and somehow I've managed to turn off the default GUI, which I believe is the Qt interface. So now when I start VLC I get the process showing up in Task Manager (I'm on Win 7) but no GUI. How can I re-enable the GUI without an uninstall/reinstall? I don't want to do this because I want to keep mucking around with it, and if I accidentally turn off the GUI again I don't want to have to go through some uninstall/reinstall loop. I thought it would just be a config file entry somewhere, but do you think I can find it? The config file, that is.
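
    In case it helps, VLC can be told which interface module to use from the command line, and it can also throw away its saved preferences entirely, either of which should bring the Qt GUI back without reinstalling; a minimal sketch (the install path is the usual default, adjust if yours differs):

        rem start VLC with the Qt interface forced on
        "C:\Program Files\VideoLAN\VLC\vlc.exe" --intf qt
        rem or reset all saved preferences, including the interface choice
        "C:\Program Files\VideoLAN\VLC\vlc.exe" --reset-config

    The config file being hunted for is normally %APPDATA%\vlc\vlcrc.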

    Read the article

  • Recovering a lost website with no backup?

    - by Jeff Atwood
    Unfortunately, our hosting provider experienced 100% data loss, so I've lost all content for two hosted blog websites: http://blog.stackoverflow.com and http://www.codinghorror.com. (Yes, yes, I absolutely should have done complete offsite backups. Unfortunately, all my backups were on the server itself. So save the lecture; you're 100% absolutely right, but that doesn't help me at the moment. Let's stay focused on the question here!) I am beginning the slow, painful process of recovering the website from web crawler caches. There are a few automated tools for recovering a website from internet web spider (Yahoo, Bing, Google, etc.) caches, like Warrick, but I had some bad results using this:
      - My IP address was quickly banned from Google for using it
      - I get lots of 500 and 503 errors and "waiting 5 minutes…"
      - Ultimately, I can recover the text content faster by hand
    I've had much better luck by using a list of all blog posts, clicking through to the Google cache and saving each individual file as HTML. While there are a lot of blog posts, there aren't that many, and I figure I deserve some self-flagellation for not having a better backup strategy. Anyway, the important thing is that I've had good luck getting the blog post text this way, and I am definitely able to get the text of the web pages out of the Internet caches. Based on what I've done so far, I am confident I can recover all the lost blog post text and comments. However, the images that go with each blog post are proving…more difficult. Any general tips for recovering website pages from Internet caches, and in particular, places to recover archived images from website pages? (And, again, please, no backup lectures. You're totally, completely, utterly right! But being right isn't solving my immediate problem… Unless you have a time machine…)
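
    For the by-hand route, a throttled wget against the Google cache (and the Wayback Machine, which often kept the images too) saves a lot of clicking; a minimal sketch, with a made-up post URL and deliberately slow pacing to avoid the bans mentioned above:

        # one cached page, fetched politely; loop this over the list of post URLs
        wget --wait=30 --random-wait -U "Mozilla/5.0" -O 001234.html \
          "http://webcache.googleusercontent.com/search?q=cache:www.codinghorror.com/blog/archives/001234.html"
        # the Wayback Machine redirects to its nearest snapshot of the same post
        wget -O 001234-wayback.html "http://web.archive.org/web/2010/http://www.codinghorror.com/blog/archives/001234.html"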

    Read the article

  • How to automatically restart RRAS service after OpenVPN

    - by JT
    I have OpenVPN set up on a Windows Server 2003 box using a routed configuration. This allows users to connect and access the work LAN subnet. There are remote hosts/services, however, that are only accessible when reached via the work network. To enable access to these, I push routes out to the clients to make sure traffic to those destinations goes across the VPN, and NAT the traffic using RRAS. This all works, except: if I restart the OpenVPN service, network traffic stops working until I restart the RRAS service as well. Is there a good way for me to make the RRAS service start/restart after OpenVPN? Are service dependencies the way to go? Obviously I could write a batch file to do this, but I'd like to make the process as bullet-proof and obvious as I can so it doesn't cause problems for other admins.
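
    Service dependencies do cover the start-order half of this: if RRAS (service name RemoteAccess) is made dependent on the OpenVPN service, Windows will refuse to start it before OpenVPN and will insist it be stopped along with it. A minimal sketch, with the OpenVPN service name and the placeholder dependencies as assumptions (confirm both with the query commands first):

        rem find the exact OpenVPN service name and RRAS's current dependency list
        sc query state= all | findstr /i openvpn
        sc qc RemoteAccess
        rem depend= replaces the whole list, so repeat the existing entries, slash-separated,
        rem before adding the OpenVPN service name reported above
        sc config RemoteAccess depend= "ExistingDep1/ExistingDep2/OpenVPNService"

    The catch is that Windows will not restart RemoteAccess for you after OpenVPN comes back up, so a two-line batch file (net stop RemoteAccess, net start RemoteAccess) run after any OpenVPN restart may still be the simplest bullet-proof answer.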

    Read the article

  • PCI compliance when using third-party processing

    - by Moses
    My company is outsourcing the development of our new e-commerce site to a third-party web development company. The way they set up our site to handle transactions is by having the user enter the necessary payment info, then passing that data to a third-party merchant that processes the payment, then completing the transaction if everything is good. When the issue of PCI DSS compliance was raised, they said: "You won't need PCI certification because the client's browser will send the sensitive information directly to the third-party merchant when the transaction is processed. However, the process will be transparent to the user because all interface and displays are controlled by us. The only server required to be compliant is the third-party merchant's, because no sensitive card data ever touches your server or web app." Even though I very much trust and respect the knowledge of our web developers, what they are saying is raising some serious red flags for me. The way the site is described, I am sure we will not be using a hosted payment page like PayPal or Google Checkout offers (how could we maintain control over the UI if we were?). And while my knowledge of e-commerce is laughable at best, it seems like the only other option for us would be to use XML Direct to communicate with our third-party merchant for processing. My two questions are as follows:
      1. Based off everything you've read, is "XML Direct" the only option they could conceivably be using, or is there another method I don't know of which they could be implementing?
      2. Most importantly, is it true our site does not need PCI certification? As I understand it, using the XML Direct method means that we do have to be PCI DSS certified, and the only way around getting certified is through a hosted payment page (i.e. PayPal).

    Read the article

  • Implementing a form of port knocking + Phone Factor = 2 Factor auth for RDP?

    - by jshin47
    I have been looking into how to secure a publicly-available RDP endpoint and want to implement our two-factor authentication RADIUS server, PhoneFactor. I would like to implement the following process:
      - User opens up the web app in a browser
      - In the web app, user enters username + password, initiating RADIUS auth
      - PhoneFactor calls the user to complete auth
      - Once the user is authenticated, port 3389 is opened to the user's IP on the pfSense firewall
      - After some amount of time, the firewall rule is removed for that IP
    I would like to know the following: Is this a typical setup? If it is a bad idea, please explain why. If it is possible, are there any packages that assist with this? Specifically, the third step, where the appropriate firewall rule would need to be added... Edit: I am aware of TS Web Gateway, but I want the users to be able to use the traditional RDP client...
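
    On the pfSense side, the open/close steps map naturally onto a pf anchor that the web app's backend loads a rule into after PhoneFactor succeeds and flushes again later; a minimal sketch of just that piece, assuming an anchor named rdpknock is already referenced from the main ruleset and 203.0.113.7 is the freshly authenticated client:

        # allow RDP from the just-authenticated IP
        echo "pass in quick proto tcp from 203.0.113.7 to any port 3389" | pfctl -a rdpknock -f -
        # later (cron/at), close the door again
        pfctl -a rdpknock -F rules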

    Read the article

  • Exalytics Increases Customer Revenue, and Saves Time, Risk & Cost

    - by Mike.Hallett(at)Oracle-BI&EPM
    We are getting some great proof-point stories now from our customers who are succeeding with the Exalytics in-memory system for OBI and Essbase. See below for some recent testimony:
      - San Diego Unified School District harnesses attendance, procurement, and operational data with Oracle Exalytics, generating $4.4 million in savings: according to an independent assessment by Mainstay Salire, the district is on track to achieve substantial benefits from the Oracle Exalytics solution, including an $8.25 million increase in attendance revenue, $75,000 a year savings in operational efficiencies, and $1 million in hardware cost avoidance.
      - NilsonGroup chooses the Oracle Exalytics In-Memory Machine as its solution to access critical data to keep its stores competitive with real-time Mobile BI: it took only "3 days to get up and running" with Exalytics. Video
      - Nykredit, in the Danish financial sector, describes their experiences from testing the Exalytics Business Intelligence Machine: "it was up and running within 4 days" with "more intuitive dashboards", "up to 70x better performance", and "cheaper maintenance and lower total cost of ownership". Video
      - Sodexo chose Oracle Exalytics as their business analytics platform, accelerating Essbase performance "more than 8x" for more than 2,000 Excel add-in users, "significantly changing how people in information management now deal with data". Video
      - Polk, Savvis, Nykredit, and Key Energy describe testing of the Oracle Exalytics In-Memory Machine: to "reach more users than we ever have before", "to fly through the data without impeding the analytic process", "drive our enterprise groups into this tool instead of having departmental solutions", and the "advanced visualisation this product enables". Video

    Read the article

  • CVSNT to CVSNT Migrations

    - by BParker
    Has anybody re-homed a CVSNT installation? I need to take a CVSNT setup on one ageing server and relocate it to a new shiny server, but I don't have a great deal of experience with this software. Does anyone know the basic process for moving an entire repository to a new server? We have 3 base repositories, totaling about 6GB. Size-wise it's not too big, but I need to make sure all the history is migrated correctly. I've googled around a bit but not managed to find any info on this kind of move; everyone seems to prefer to write how-tos on moving to Subversion et al.... The CVSNT server is running on Windows, if that makes a great deal of difference.
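
    For what it's worth, a CVSNT repository is just a directory tree of RCS ,v files plus the CVSROOT admin directory, so the history travels with a plain file copy. A minimal sketch of the usual approach; the paths, share name, and the exact service name to stop are assumptions:

        rem stop the CVSNT service on the old server so nothing commits mid-copy
        net stop "CVSNT"
        rem copy each repository intact, permissions and all
        robocopy D:\cvsrepo \\newserver\d$\cvsrepo /MIR /COPYALL /R:1 /W:1
        rem then register D:\cvsrepo as a repository in the CVSNT control panel on the new box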

    Read the article

  • SQL Server differential backup: Simple vs. Full recovery model

    - by MaxiWheat
    I need to better understand the backup process under SQL Server 2008. Since drive space is something of a concern for us and we want a better disaster recovery solution, I decided that we will implement differential backups throughout the day (every hour). Am I right to think that if I keep the recovery model of my databases set to Simple, the differential backup will be almost the same size as a full backup (too big to make one every hour)? I already tried switching to Full recovery and it seemed to fix the issue (the differential backups were way smaller). I have heard that the recovery model must be set to Full to use log backups (to-the-minute recovery, etc., but we don't need that), but never anything about differential backups. So, does the recovery model really have an impact on differential backups, or am I missing something? Thank you
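
    For reference, this is the shape of the two commands involved, runnable from a scheduled task through sqlcmd; the instance, database name, and paths are placeholders:

        rem weekly (or nightly) full backup, which becomes the differential base
        sqlcmd -S . -E -Q "BACKUP DATABASE [MyDb] TO DISK='D:\Backups\MyDb_full.bak' WITH INIT"
        rem hourly differential: only the extents changed since the last full backup
        sqlcmd -S . -E -Q "BACKUP DATABASE [MyDb] TO DISK='D:\Backups\MyDb_diff.bak' WITH DIFFERENTIAL, INIT"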

    Read the article

  • Where to download the latest Windows ISO (legally)?

    - by Doon
    I have the DreamSpark MS license. However, every time I have to install Windows on my PC, I get many updates and this process is long and boring. Is there any way to download the latest Windows ISO with updates? I'm trying to install Windows 7 (but Windows 8 would also be helpful). Note: I don't want to just download Windows. I want to download the OS with the latest updates, those that are downloaded with Windows Update after installing it!! Thanks!
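
    If no pre-updated ISO turns up, one workaround is to slipstream the updates into the install image yourself with DISM and then rebuild the ISO; a minimal sketch, with every path and the KB number as placeholders:

        rem mount the install image extracted from the original ISO
        dism /Mount-Wim /WimFile:C:\iso\sources\install.wim /Index:1 /MountDir:C:\mount
        rem inject each downloaded update package (.msu or .cab)
        dism /Image:C:\mount /Add-Package /PackagePath:C:\updates\KB1234567.msu
        rem commit the changes, then rebuild a bootable ISO (e.g. with oscdimg from the WAIK)
        dism /Unmount-Wim /MountDir:C:\mount /Commit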

    Read the article

  • Netsh commands not working on remote computer

    - by Mike Christiansen
    Hello, at work we are in the process of migrating over 200 computers from static IPs to DHCP. The DHCP server is configured. My biggest hurdle is physically going to every single computer in the area and configuring them all for DHCP. I am trying to use netsh to accomplish this. However, I cannot even seem to set one computer to DHCP remotely. The commands I am trying are:

        netsh -r COMPUTERNAME interface ip set address name="Local Area Connection" source=dhcp
        netsh -r COMPUTERNAME interface ip set dns name="Local Area Connection" source=dhcp

    This results in the error:

        The following command was not found: interface ip set address "name=Local Area Connection" source=dhcp.

    Any ideas?
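
    If netsh's -r remoting keeps mangling the arguments (and note that, as far as I know, it relies on the Remote Registry service running on each target), a common fallback is to run netsh locally on each machine through Sysinternals PsExec; a minimal sketch:

        psexec \\COMPUTERNAME -s netsh interface ip set address name="Local Area Connection" source=dhcp
        psexec \\COMPUTERNAME -s netsh interface ip set dns name="Local Area Connection" source=dhcp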

    Read the article

  • Reinstalling phpMyAdmin

    - by explorex
    EDIT:: I installed mysql-server but not phpmyadmin (since phpmyadmin was installed before MySQL, that resulted in an error). How do I reinstall phpmyadmin together with its database (there is no phpmyadmin database)? Uninstalling it and reinstalling it didn't help. I was trying to install phpmyadmin (and Zend Framework) through Synaptic Package Manager, but in the middle I was prompted for a password. I thought it was the phpmyadmin password and proceeded, but I got an error and aborted. Then I tried to reinstall again; it reinstalled, but I am not getting phpmyadmin.
    EDIT:: The following is invalid, so please don't bother. Apache is running but MySQL is not; some of its characteristics are:

        santosh@explorer:~$ mysql
        ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
        santosh@explorer:~$ sudo apt-get install mysql-server
        E: Could not get lock /var/lib/dpkg/lock - open (11: Resource temporarily unavailable)
        E: Unable to lock the administration directory (/var/lib/dpkg/), is another process using it?

    Should I be asking this question here or on Stack Overflow?
    UPDATES:: after restarting my computer:

        santosh@explorer:~$ sudo service mysql start
        [sudo] password for santosh:
        mysql: unrecognized service

    UPDATES::

        santosh@explorer:~$ sudo apt-get install mysql
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        E: Unable to locate package mysql
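
    A common way to get a clean second attempt is to purge phpMyAdmin, so its half-finished debconf/database setup is discarded, and reinstall it once MySQL is running; a minimal sketch for the Ubuntu packages:

        sudo apt-get install mysql-server       # the server package is mysql-server, not "mysql"
        sudo apt-get purge phpmyadmin           # remove the broken install along with its configuration
        sudo apt-get install phpmyadmin         # re-runs the prompts; dbconfig-common recreates the phpmyadmin database
        sudo dpkg-reconfigure phpmyadmin        # re-answer the same prompts later if needed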

    Read the article

  • NRF Big Show 2011 -- Part 1

    - by David Dorf
    When Apple decided to open retail stores, they came to 360Commerce (now part of Oracle Retail) to help with the secret project. Similarly, when Disney Stores decided to reinvent itself, they also came to us for their POS system. In both cases, visiting a store is an experience where sales take a backseat to entertainment, exploration, and engagement. This quote from a recent Stores Magazine article says it all: "We compete based on an experience, emotion and immersion like Disney," says Neal Lassila, vice president of global information technology for Disney. "That's opposed to [competing] on price and hawking a doll for $19.99. There is no sales pressure technique." Instead, it's about delivering "a great time." While you're attending the NRF conference in New York next week, you'll definitely want to stop by the new 20,000-square-foot Disney store in Times Square. If you're not attending, you can always check out the videos to get a feel for the stores' vibe. This year we've invited Disney Stores to open a pop-up store within the Oracle Retail booth. There will be lots of items on sale that fit in your suitcase, and there's no better way to demonstrate our POS, including the mobile POS running on an iPod Touch. You should also plan to attend Tuesday morning's super-session, The Magic of the Disney Store: An Immersive Retail Experience, with Steve Finney. In the case of Apple and Disney, less POS is actually a good thing. In both cases it was important to make the checkout process fast and easy so as not to detract from the overall experience. There will be ample opportunities to see this play out in New York next week, so I hope you take advantage.

    Read the article

  • How do I setup a systemd service to be started by a non root user as a user daemon?

    - by Hans
    I just finished the install and setup process of systemd on my Arch Linux system (2012.09.07). I uninstalled initscripts (and removed the configuration files). What I want to do is create a service that can be started and stopped by a non-root user. The service is to start a detached screen session running rtorrent. However, I want every user on the system who has enabled this service to have a particular instance started for them specifically. How would one go about doing this? I remember reading that systemd supports user instances of services, but I have been unable to find any information on how to set this up, or whether it relates to what I am looking for. The service file that I have used system-wide:

        [Unit]
        Description=rTorrent

        [Service]
        Type=forking
        ExecStart=/usr/bin/screen -d -m -S rtorrent /usr/bin/rtorrent
        ExecStop=/usr/bin/killall -w -s 2 /usr/bin/rtorrent
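
    This is roughly what the per-user variant of that unit would look like: the same file dropped into each user's ~/.config/systemd/user/ and managed with systemctl --user, so every account that enables it gets its own rtorrent instance. A minimal sketch, assuming systemd's user-session support is actually running for those accounts (on 2012-era systemd that part may need extra wiring):

        # ~/.config/systemd/user/rtorrent.service
        [Unit]
        Description=rTorrent (per-user instance)

        [Service]
        Type=forking
        ExecStart=/usr/bin/screen -d -m -S rtorrent /usr/bin/rtorrent
        ExecStop=/usr/bin/killall -w -s 2 /usr/bin/rtorrent

        [Install]
        WantedBy=default.target

    Each user then runs systemctl --user enable rtorrent.service followed by systemctl --user start rtorrent.service.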

    Read the article

  • Problems with icons and shutting down Ubuntu 12.04

    - by Anders
    I recently upgraded from 11.10 to 12.04 as a dual boot (Windows 7) and am having problems with shutting down. I installed from a CD that I burned from the ISO file, and to get it to boot from the CD, I needed to install the special helper program from Windows that would then recognize the CD as the system booted. The installation gave me my own user account as well as a guest account. For the user account there is no "gear" icon (in the upper right-hand side) from which to access the shutdown menu. Interestingly, the icons for the Home folder, the Ubuntu One folder, and the System Settings folder are also missing, although there are blank places that show these names if the mouse cursor is positioned over these areas. They will even launch with the press of a key, but no icons are visible. For the guest account, all of these icons are visible and usable. The problem with shutting down is that I need to leave the user account, move to the guest account (so that I can access the top-right "gear" icon that has the shutdown menu item) and press the shutdown button. The problem here is that when the shutdown menu appears and I confirm that I wish to shut down the computer, the page blanks (as one might hope in the process of closing down), and then pops up with the login menu, giving the option of logging in as a user or as a guest. So the questions are:
      1. How do you reinstate the far-right top icon from which you can access the shutdown drop-down menu on the user account page?
      2. How do you get the icons to display properly on the left-hand side (Home, Ubuntu One, System Settings, and Workspace Switcher)?
      3. How do you get the system to turn off when you press the shutdown button?
    Boy oh boy! Thanks a bunch for any help!
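
    Until the missing indicator comes back, the machine can at least be shut down cleanly from a terminal (Ctrl+Alt+T), and resetting Unity's saved settings sometimes restores missing launcher and indicator icons on 12.04; a minimal sketch, and the reset part is very much a guess:

        # clean shutdown without the gear menu
        sudo shutdown -h now
        # reset Unity/Compiz settings for the current user, then reload Unity
        dconf reset -f /org/compiz/
        setsid unity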

    Read the article
