Search Results

Search found 18976 results on 760 pages for 'ash machine'.

Page 167/760 | < Previous Page | 163 164 165 166 167 168 169 170 171 172 173 174  | Next Page >

  • What's new with Java technology? Java Embedded

    - by hinkmond
    As this article points out, Java Embedded is a safer, more robust, and easier-to-develop platform for small networked devices. So, get ready for good things to come from Java Embedded... See: Java Embedded: Next New Thing. Here's a quote: "Through the past few years the industry as we know it has seen a big boom with the mobile and cloud revolution. Today, there has been an enormous amount of buzz around machine to machine (M2M) or the "Internet of Things," since we are moving into a state where everything is going to have to be interconnected and will have to properly communicate together... Today, Java Embedded provides that platform." I like it! As long as there's no Zombie Apocalypse, I think Java Embedded has a great future! Hinkmond

    Read the article

  • Headset undetected when plugged in

    - by tough
    I have recently installed Ubuntu 12.04 on my machine, which was running Windows 7. I have been trying to configure the audio to work exactly as it used to in Windows but have never been able to do so. I have followed this link exactly. I am still not getting the required configuration. The alsamixer command shows me 5 adjustable controls: Master "adjustable", Speaker "adjustable", PCM "adjustable", Front "adjustable" and Beep "adjustable", plus Mic Jack, Mic In or Line In, S/PDIF OO "in a box", S/PDIF D OO "in a box", S/PDIF P DIGITAL or Analog M. It does not detect the headset jack when plugged in; what I mean is that the sound from the speakers does not go off when I plug in my headset jack. How can I make this work? Some other googling also did not help. I am on an HP Pavilion DV7 machine. The chip is IDT 92HD75B3X5 and the card is HDA ATI SB.
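    A possible workaround, not a confirmed fix for this codec: HDA jack-sensing problems are sometimes worked around by passing a model hint to the snd-hda-intel driver and reloading ALSA. The model string below is only an illustrative guess; the valid strings for the IDT 92HD75 family are listed in the HD-Audio-Models.txt file shipped with the kernel documentation:
      # "hp-dv5" is a placeholder -- pick a model string documented for your codec
      echo "options snd-hda-intel model=hp-dv5" | sudo tee /etc/modprobe.d/local-hda.conf
      sudo alsa force-reload    # or simply reboot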

    Read the article

  • LibreOffice english spelling dictionary missing

    - by rossouwap
    I've got two machines, same OS (Ubuntu 11.10 x86_64), same LibreOffice PPAs (ppa:libreoffice/ppa). One has the "English spelling dictionaries, hyphenation rules, thesaurus..." extension in the extension manager, the other doesn't. Each was upgraded using the PPA from 3.5.0 to 3.5.1. Can anyone provide some insight as to how to get this extension onto the second machine? I could remove LibreOffice from the second machine and install the packages from the LibreOffice site, but I would prefer to keep the PPA, as I then don't need to remember to upgrade.
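    Just a hedged guess rather than an answer from the thread: on Ubuntu the English spelling/hyphenation/thesaurus data usually comes from distro packages rather than from the extension manager, so comparing the two machines may show what is missing (the package names below are my assumption for the English set):
      dpkg -l | grep -E 'hunspell|hyphen|mythes'                        # run on both machines and compare
      sudo apt-get install hunspell-en-us hyphen-en-us mythes-en-us     # install whatever the second machine lacks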

    Read the article

  • C# - How to detect all IP addresses from a LAN?

    - by SAMIR BHOGAYTA
    // Requires: using System.Net; (Dns.GetHostByName is obsolete in newer .NET; Dns.GetHostEntry is the modern equivalent.)
    string strHostName = string.Empty;
    cmbIPAddress.Items.Clear();
    // Getting the IP address of the local machine...
    // First get the host name of the local machine.
    strHostName = Dns.GetHostName();
    // Then, using the host name, get the IP address list.
    IPHostEntry ipEntry = Dns.GetHostByName(strHostName);
    IPAddress[] iparrAddr = ipEntry.AddressList;
    if (iparrAddr.Length > 0)
    {
        for (int intLoop = 0; intLoop < iparrAddr.Length; intLoop++)
        {
            cmbIPAddress.Items.Add(iparrAddr[intLoop].ToString());
        }
    }

    Read the article

  • Virtualization: two new tools for Windows Server 2008 R2 and a major change to XP Mode

    Virtualization: two new tools for Windows Server 2008 R2, and a major change to Windows 7's XP Mode. Microsoft has just announced two upcoming Service Packs for Windows 7 and Windows Server 2008 R2. The SP1 for the latter will introduce two new virtualization tools: Microsoft RemoteFX and Dynamic Memory. Windows 7's SP1, for its part, will only bundle the updates already delivered regularly through Windows Update. Dynamic Memory is an enhancement to the Hyper-V technology that lets IT administrators pool all the memory available on a host machine and distribute it dynamically to the virtual machine...

    Read the article

  • Installed LibreOffice 4 with ppa, how do I remove it and go back to LibreOffice 3?

    - by MMA
    EDIT This question is not at all a duplicate of How to downgrade from LibreOffice 4.0 to 3.6? The above-mentioned question talks about downgrading from a specific version of LibreOffice, namely from 4.0 to 3.6. The solutions mentioned are not the ones I am looking for. They will work, but I wanted a general solution for going from a higher version to a lower one without adding a PPA or downloading .deb files. The above solutions suggest either downloading .deb files for LibreOffice 3.6 or adding a repository for it. Furthermore, some of the answers put a disproportionate (though applicable) stress on the use of Synaptic rather than a general command-line solution. That made me wonder: if, at this very moment, I took a fresh computer and installed Ubuntu 12.04, LibreOffice would install without a hitch. Then why can I not install LibreOffice on my 12.04 machine today from the simple command line? This answer to my question clarified everything. I need to use ppa-purge, which resets all packages from a PPA to the standard versions released for my distribution. Basically, it is a way to restore my system to the way it was before I installed packages from a PPA. This article further elaborates the idea. The above-mentioned answer worked perfectly for me. It was actually an education for me, since it taught me how to downgrade a package that was added via a PPA. I had upgraded from LibreOffice 3 to LibreOffice 4 using the PPA. Now, since I found that LibreOffice 4 has some issues, including handling my native language, I want to move back to LibreOffice 3. To accomplish this, I removed the LibreOffice config directory from my home directory and then purged LibreOffice from my machine: sudo apt-get purge libreoffice-* Then I removed the relevant PPAs using the sudo apt-add-repository --remove command, and then ran sudo apt-get update. Now, when I try to install LibreOffice using the command sudo apt-get install libreoffice I get an avalanche of output about unmet dependencies, something like: The following packages have unmet dependencies: libreoffice : Depends: libreoffice-core (= 1:3.5.7-0ubuntu4) but it is not going to be installed (snipped) If I dig into the issue further by using the command sudo apt-get install libreoffice-core I get: The following packages have unmet dependencies: libreoffice-core : Depends: libreoffice-common (> 1:3.5.7) but it is not going to be installed Depends: libexttextcat0 (>= 2.2-8) but it is not going to be installed Depends: ure (>= 3.5.7~) but it is not going to be installed E: Unable to correct problems, you have held broken packages. Could you please tell me how to install LibreOffice 3 on my machine? I am using Ubuntu 12.04 LTS.
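    For reference, a minimal sketch of the ppa-purge route mentioned in the EDIT above (the PPA name is the one quoted in the question; adjust it if yours differs). Note that ppa-purge expects the PPA to still be enabled, so in the state described above it would have to be re-added first:
      sudo apt-get install ppa-purge
      sudo ppa-purge ppa:libreoffice/ppa   # resets every package from that PPA to the stock Ubuntu version
      sudo apt-get install libreoffice     # the stock 3.5.x packages should then install cleanly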

    Read the article

  • Problems with EmusicJ on Ubuntu 14.04

    - by Michael Dykes
    I am running Ubuntu 14.04 on my new machine and have a subscription to eMusic (which I used back in my Windows days because of its great selection of classical music). I have used it on my old machine (running older versions of Ubuntu) with some success, but am having difficulties doing so now. I am trying to use EmusicJ but cannot remember how to do so. I have downloaded the program, unzipped it, and ran the following command in a terminal: ./emusic If I remember correctly, once I go to the eMusic site and download my music, the EmusicJ program should open and begin downloading the music, but this time it does not, and I have already wiped my old box in order to give it to a friend. Any help is appreciated, as I would like to continue with eMusic, but if I cannot get this resolved I will likely have to cancel my subscription and find a better alternative. Thanks.

    Read the article

  • How do I calculate a SWAP partition?

    - by Dean Howell
    Since the late 90s I've always understood that it is best practice to allocate twice the amount of physical RAM as swap space. I just received my new laptop in the mail and it came with 6GB of RAM. In a separate order I had bought 16GB of RAM to replace the preinstalled memory. I didn't have the right Torx driver to get to the RAM in this machine, so I installed Ubuntu and manually set a 16GB partition for swap; 32GB seemed a tad excessive... Did I make the right choice? Would my machine perform better if there were no swap at all?
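    As a general aside (not specific to this laptop): a swap partition can be inspected and resized later without reinstalling, so the initial choice is not final. A few commands for seeing what you actually have and how the kernel uses it:
      free -m                         # physical RAM and current swap usage, in MB
      swapon -s                       # which swap partitions/files are active and how large they are
      cat /proc/sys/vm/swappiness     # how eagerly the kernel swaps (Ubuntu default is 60)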

    Read the article

  • How to disable DisplayInfo for Intel i810 when there is no xorg.conf

    - by user86954
    I'm testing Ubuntu 12.04 on an office desktop. This machine has an integrated Intel i810. This very machine used to run Ubuntu (or Kubuntu) quite well until the Natty Narwhal release, when the X server started freezing. This seems to be caused by an issue with a call to the video BIOS, which can be circumvented by disabling the DisplayInfo option. I tried creating an xorg.conf by using Xorg -configure, but it produced an error after trying to make a file. I did have a file, and I put it in /etc/X11 with DisplayInfo false (maybe it wants "disabled"?), but the machine wouldn't boot at all until I removed that xorg.conf. There is no 10-monitor.conf file either. All I want to do is set the appropriate option so that my Intel display doesn't freeze. Is there any other way I can disable DisplayInfo? Andrew
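    One untested possibility (the xorg.conf syntax below is standard, but the option name is taken straight from the question, so treat the whole thing as an assumption): put only that one option into a snippet under /usr/share/X11/xorg.conf.d/ so the rest of the autoconfiguration stays untouched, instead of supplying a full xorg.conf:
      printf '%s\n' 'Section "Device"' '    Identifier "Intel i810"' '    Driver "intel"' '    Option "DisplayInfo" "FALSE"' 'EndSection' | sudo tee /usr/share/X11/xorg.conf.d/20-intel-displayinfo.conf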

    Read the article

  • Why are KDE components being installed/updated on 12.04 with GNOME?

    - by Dune
    I am not yet fully versed in the components installed by default on my machine, so I will apologize in advance if my question is silly. My upgrade output shows that a lot of (what I assume are) KDE components (libk*, kde*, etc.) are being installed/updated on my machine. That is just the output from sudo apt-fast update && sudo apt-fast dist-upgrade -y from a few minutes ago. Can anyone tell me why? Can I safely remove them? If yes, how? Thanks in advance for any replies. System specs: fully updated Ubuntu 12.04 x86_64 w/ kernel 3.4, GNOME, Unity, Core2Duo, 4GB
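    Not from the thread itself, but a quick way to see why any one of those libk*/kde* packages is on the system is to ask the package manager which installed packages pull it in (libkdecore5 below is just an example name; substitute one from your own upgrade list):
      aptitude why libkdecore5                       # dependency chain that pulls the package in (needs aptitude installed)
      apt-cache rdepends --installed libkdecore5     # alternative using apt-cache only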

    Read the article

  • Minimum to install for a visual web browser in Ubuntu Server

    - by Svish
    I have set up a machine with Ubuntu Server. Some of the server software I want to run on it has web-based user interfaces for setting it up, et cetera. I know I could connect to it from a different machine which has a graphical user interface, but in this case I would rather do it on the box. So, from a fresh Ubuntu Server installation, what is the minimum I need to install to be able to launch a web browser I can use for this? For example Chromium, Firefox, or Arora.
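    A rough sketch of the smallest set I would expect to work (the package list is a guess, not a tested recipe): a bare X server, a tiny window manager and one lightweight browser, installed without recommended extras:
      sudo apt-get install --no-install-recommends xorg openbox arora
      startx /usr/bin/arora     # start X with only the browser as the client (or start openbox and launch arora from inside it)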

    Read the article

  • How To Enable 3D Acceleration and Use Windows Aero in VirtualBox

    - by Chris Hoffman
    VirtualBox’s experimental 3D acceleration allows you to use Windows 7’s Aero interface in a virtual machine. You can also run older 3D games in a virtual machine – newer ones probably won’t run very well. If you installed Windows 7 in VirtualBox, you may have been disappointed to see the Windows 7 Basic interface instead of Aero – but you can enable Aero with a few quick tweaks.
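    The article walks through the VirtualBox GUI; roughly the same settings can be applied from the command line (the guest name "Win7" is a placeholder, and the guest still needs the Guest Additions with the experimental Direct3D support installed):
      VBoxManage modifyvm "Win7" --accelerate3d on    # enable 3D acceleration (the VM must be powered off)
      VBoxManage modifyvm "Win7" --vram 128           # give the virtual GPU enough video memory for Aero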

    Read the article

  • System freezes while not in use, how do I fix this?

    - by PHLAK
    Bear with me, the following is a bit long-winded. I have Ubuntu 10.10 Desktop 64-bit installed on my laptop and up until a few weeks ago it had been running great. Then one day, while I was not using the laptop, it froze. I was logged in as my user but had locked the screen and closed the lid. I didn't notice that it had frozen until I opened the lid and wiggled the mouse to try and log in. The screen remained black and I got no response. I immediately tried Alt + F2, F3, F4, etc. but got no response. The only thing I could do was hold the power button to power off the machine. The freezing has happened as quickly as within 10-20 minutes of the system being logged off and lid closed, and as long as 4-6 hours. My machine is NOT configured to go into standby when plugged in, and this has happened both on AC power and battery. Troubleshooting I have performed: I uninstalled programs I knew that I had installed between when it was working fine and having problems. Those programs were CrashPlan, Shutter and Conky. After uninstalling ALL of these programs the freezing still occurs. Next, I decided to SSH into the machine from my desktop and leave an htop and a tail of the syslog running. Here are screenshots of the last thing shown on both when the system froze: htop, syslog. Here is a dump of my syslog after another freeze. The freeze happened at 9:14 and I didn't notice it until about 10 minutes later and rebooted, hence the 10 minute gap from 9:14 to 9:24. In the above syslog dump I noticed a lot of NVRM: os_raise_smp_barrier(), invalid context! and upon investigating that message learned it was from the proprietary Nvidia driver I had installed. Thinking this could be part of the problem, I uninstalled the Nvidia driver and reverted to using the Nouveau driver. The computer still froze after a few hours. Lastly, thinking the problem could be caused by overheating, I used compressed air to blow out any dust in the CPU vents and all other openings on the laptop. None of the above troubleshooting has helped and the freezing still occurs. What other steps can I take to troubleshoot and/or fix this problem? Note: Yesterday X started to eat up a lot of CPU power and eventually froze my system while I was forwarding an X session over SSH (from another PC to my laptop). I'm unsure if this is related or not, as it doesn't match any of the symptoms of the problem above. Aside from this, the system has never frozen while in use, even under heavy load. EDIT: I just ran Memtest86+ and it made it through two passes without any errors. Just eliminating possible causes here.

    Read the article

  • Boot-Repair after messing up NTFS partition

    - by QuietThud
    I posted this question explaining what happened after I tried to create a new swap partition on a Windows/Ubuntu dual-boot machine. I have since created a live-boot USB and installed Boot-Repair. I had it "recommend repairs", after which it tried repairing the Wubi filesystems, which as far as I'm aware was not necessary. I'm not sure where to go from here; I don't care very much about backing up my files, I just want to be able to boot the machine. In the Advanced Options, the "Repair Windows boot files" box is uncheckable, and both GRUB tabs are unclickable (I do have GRUB installed). Here is my pastebinit output with the details from Boot-Repair. Please be as explicit as possible, as I am proving to be disproportionately bad at this type of task. Thank you! P.S. I keep seeing: cryptsetup: WARNING: failed to detect canonical device of overlayfs cryptsetup: WARNING: could not determine root device from /etc/fstab TestDisk: GPARTED:

    Read the article

  • Does 64-bit Ubuntu work on the Acer Aspire One D255

    - by hippietrail
    The Acer Aspire One D255 is the cheapest dual-core netbook on the market right now. It has an Intel Atom N550, which should be able to run a 64-bit OS. But when I try to boot the Ubuntu 64-bit live CD I only get one line of diagnostic output saying that it "found something" on the USB CD drive before locking up. I haven't been able to find anything by Googling yet. Could it just be driver issues for this machine, or could the platform be inherently too frail to run 64-bit? (My machine is two days old and on trial; Windows 7 and Ubuntu 32-bit both run, but it has locked up under casual use on both OSes.)
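    Whatever the netbook-specific issue turns out to be, you can at least confirm from the running 32-bit system that the Atom N550 is 64-bit capable by checking for the "lm" (long mode) CPU flag:
      grep -q -w lm /proc/cpuinfo && echo "CPU supports 64-bit" || echo "32-bit only"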

    Read the article

  • JPRT: A Build & Test System

    - by kto
    DRAFT A while back I did a little blogging on a system called JPRT, the hardware used and a summary on my java.net weblog. This is an update on the JPRT system. JPRT ("JDK Putback Reliability Testing", but ignore what the letters stand for, I change what they mean every day, just to annoy people :\^) is a build and test system for the JDK, or any source base that has been configured for JPRT. As I mentioned in the above blog, JPRT is a major modification to a system called PRT that the HotSpot VM development team has been using for many years, very successfully I might add. Keeping the source base always buildable and reliable is the first step in the 12 steps of dealing with your product quality... or was that the 12 steps from Alcoholics Anonymous... oh well, anyway, it's the first of many steps. ;\^) Internally when we make changes to any part of the JDK, there are certain procedures we are required to perform prior to any putback or commit of the changes. The procedures often vary from team to team, depending on many factors, such as whether native code is changed, or if the change could impact other areas of the JDK. But a common requirement is a verification that the source base with the changes (and merged with the very latest source base) will build on many if not all 8 platforms, and a full 'from scratch' build, not an incremental build, which can hide full build problems. The testing needed varies, depending on what has been changed. Anyone that has worked on a project where multiple engineers or groups are submitting changes to a shared source base knows how disruptive a 'bad commit' can be on everyone. How many times have you heard: "So And So made a bunch of changes and now I can't build!". But multiply the number of platforms by 8, and make all the platforms old and antiquated OS versions with bizarre system setup requirements and you have a pretty complicated situation (see http://download.java.net/jdk6/docs/build/README-builds.html). We don't tolerate bad commits, but our enforcement is somewhat lacking, usually it's an 'after the fact' correction. Luckily the Source Code Management system we use (another antique called TeamWare) allows for a tree of repositories, and 'bad commits' are usually isolated to a small team. Punishment to date has been pretty drastic; the Queen of Hearts in 'Alice in Wonderland' said 'Off With Their Heads', well trust me, you don't want to be the engineer doing a 'bad commit' to the JDK. With JPRT, hopefully this will become a thing of the past, not that we have had many 'bad commits' to the master source base; in general the teams doing the integrations know how important their jobs are and they rarely make 'bad commits'. So for these JDK integrators, maybe what JPRT does is keep them from chewing their fingernails at night. ;\^) Over the years each of the teams has accumulated sets of machines they use for building, or they use some of the shared machines available to all of us. But the hunt for build machines is just part of the job, or has been. And although the issues with consistency of the build machines haven't been a horrible problem, often you never know if the Solaris build machine you are using has all the right patches, or if the Linux machine has the right service pack, or if the Windows machine has its latest updates. Hopefully the JPRT system can solve this problem. When we ship the binary JDK bits, it is SO very important that the build machines are correct, and we know how difficult it is to get them set up.
Sure, if you need to debug a JDK problem that only shows up on Windows XP or Solaris 9, you'll still need to hunt down a machine, but not as a regular everyday occurrence. I'm a big fan of a regular nightly build and test system, constantly verifying that a source base builds and tests out. There are many examples of automated build/tests, some that trigger on any change to the source base, some that just run every night. Some provide a protection gateway to the 'golden' source base which only gets changes that the nightly process has verified are good. The JPRT (and PRT) system is meant to guard the source base before anything is sent to it, guarding all source bases from the evil developer, well maybe 'evil' isn't the right word, I haven't met many 'evil' developers, more like 'error prone' developers. ;\^) Humm, come to think of it, I may be one from time to time. :\^{ But the point is that by spreading the build up over a set of machines, and getting the turnaround down to under an hour, it becomes realistic to completely build on all platforms and test it, on every putback. We have the technology, we can build and rebuild and rebuild, and it will be better than it was before, ha ha... Anybody remember the Six Million Dollar Man? Man, I gotta get out more often.. Anyway, now the nightly build and test can become a 'fetch the latest JPRT build bits' and start extensive testing (the testing not done by JPRT, or the platforms not tested by JPRT). Is it Open Source? No, not yet. Would you like it to be? Let me know. Or is it more important that you have the ability to use such a system for JDK changes? So enough blabbering on about this JPRT system, tell me what you think. And let me know if you want to hear more about it or not. Stay tuned for the next episode, same Bloody Bat time, same Bloody Bat channel. ;\^) -kto

    Read the article

  • Sound notification over SSH

    - by Lekensteyn
    I just switched from the Konversation IRC client to the terminal-based Irssi. I'm starting Irssi on a remote machine using GNU screen + SSH. I do not get any sound notification on new messages, which means that I have to check Irssi once in a while for new messages. That's not really productive, so I'm looking for an application/script that plays a sound (preferably /usr/share/sounds/KDE-Im-Irc-Event.ogg and not the annoying beep) on my machine if there is any activity. It would be great if I could disable the notification for certain channels. Or, if that's not possible, some sort of notification via libnotify, thus making it available to GNOME and KDE.
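    One common approach, sketched from memory (verify the setting names with /help set inside Irssi): let Irssi ring the terminal bell on messages and highlights, let the bell travel through screen and SSH, and have the local terminal turn the bell into a sound:
      /set beep_msg_level MSGS HILIGHT DCCMSGS    # inside Irssi: beep on these message levels
      /set bell_beeps ON
    How the local terminal maps the bell to a specific sound file (such as KDE-Im-Irc-Event.ogg) depends on the terminal emulator, which is the open part of this question.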

    Read the article

  • How do I shut down 12.10 when ACPI is set to OFF?

    - by rahi
    I recently installed 12.04 on an HP Spectre XT ultrabook. I installed 12.04 from a USB stick. When the install was complete and the computer restarted, I got a blank screen. After some searching, I found that I had to choose "acpi=off" during the installation, which now shows up in the GRUB screen. Once I was on 12.04, I upgraded to 12.10. The issue now is that when I hit shutdown, the computer hangs on a purple shutdown screen. Usually, at this point, I press the power button to completely power off the machine. I am new to Ubuntu and would appreciate the help. Thanks.
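    If the plan is to experiment with different ACPI-related boot options (whether another option actually cures the power-off hang on this ultrabook is an open question), the kernel parameters live in /etc/default/grub:
      sudo nano /etc/default/grub    # edit GRUB_CMDLINE_LINUX_DEFAULT, e.g. drop acpi=off or try acpi=force instead
      sudo update-grub               # regenerate the GRUB configuration with the new parameters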

    Read the article

  • Installing VMware Workstation 9.0 on Ubuntu 13.10

    - by user212290
    After I install VMware Workstation 9.0, when I want to open a VM a dialog appears saying "Before you can run VMware, several modules must be compiled and loaded into the running kernel" with CANCEL and INSTALL buttons; when I click the INSTALL button, nothing happens. When I run: sudo apt-get install linux-headers-3.11.0-12-generic sudo /usr/bin/vmware-modconfig --icon=vmware-workstation --appname=VMware I get: cc1: some warnings being treated as errors make[2]: *** [/tmp/modconfig-T9k19t/vmci-only/linux/driver.o] Error 1 make[2]: *** Waiting for unfinished jobs.... make[1]: *** [_module_/tmp/modconfig-T9k19t/vmci-only] Error 2 make[1]: Leaving directory `/usr/src/linux-headers-3.11.0-12-generic' make: *** [vmci.ko] Error 2 make: Leaving directory `/tmp/modconfig-T9k19t/vmci-only' Failed to build vmci. Failed to execute the build command. Starting VMware services: Virtual machine monitor done Virtual machine communication interface failed VM communication interface socket family done Blocking file system done Virtual ethernet failed VMware Authentication Daemon done

    Read the article

  • Why is multithreading often preferred for improving performance?

    - by user1849534
    I have a question about why programmers seem to love concurrency and multi-threaded programs in general. I'm considering 2 main approaches here: an async approach, basically based on signals, or just an async approach as described by many papers and languages like the new C# 5.0 for example, with a "companion thread" that manages the policy of your pipeline; and a concurrent or multi-threading approach. I will just say that I'm thinking about the hardware here and the worst-case scenario, and I have tested these 2 paradigms myself; the async paradigm is a winner, to the point that I don't get why people 90% of the time talk about multi-threading when they want to speed things up or make good use of their resources. I have tested multi-threaded programs and an async program on an old machine with an Intel quad-core that doesn't have a memory controller inside the CPU; the memory is managed entirely by the motherboard. Well, in this case performance is horrible with a multi-threaded application: even a relatively low number of threads like 3-4-5 can be a problem; the application is unresponsive and is just slow and unpleasant. A good async approach is, on the other hand, probably not faster, but it's not worse either; my application just waits for the result and doesn't hang; it's responsive and scales much better. I have also discovered that a context switch in the threading world is not that cheap in a real-world scenario; it's in fact quite expensive, especially when you have more than 2 threads that need to cycle and swap among each other to be computed. On modern CPUs the situation is not really that different: the memory controller is integrated, but my point is that an x86 CPU is basically a serial machine and the memory controller works the same way as on the old machine with an external memory controller on the motherboard. The context switch is still a relevant cost in my application, and the fact that the memory controller is integrated or that the newer CPUs have more than 2 cores is no bargain for me. From what I have experienced, the concurrent approach is good in theory but not that good in practice; with the memory model imposed by the hardware it's hard to make good use of this paradigm, and it also introduces a lot of issues, ranging from the use of my data structures to the joining of multiple threads. Also, both paradigms offer no guarantee about when the task or the job will be done at a certain point in time, making them really similar from a functional point of view. According to the x86 memory model, why do the majority of people suggest using concurrency with C++ rather than just an async approach? Also, why not consider the worst-case scenario of a computer where the context switch is probably more expensive than the computation itself?

    Read the article

  • System Slow After Upgrading Ubuntu

    - by Aragon N
    I have an Ubuntu network machine which runs release 10.04.1 LTS (Lucid). On this system I have Apache, PostgreSQL and Django. For some app development I had to install PHP and php-curl... Because the machine is on the network, I exported the VMware machine to the internet; first I upgraded the system and then installed the php5 packages on it. After putting it back in its old place, I noticed that queries on the new system are rather slow compared to the old one. Old system query time: 140 ms. New system query time: 9.11 s. I have checked /etc/network/interfaces and it seems there is no problem. I have checked /etc/resolv.conf and it seems OK. I have checked /etc/nsswitch.conf and only the hosts section is different from the old one; the old system has hosts: files mdns4_minimal [NOTFOUND=return] dns mdns4 I then checked time host -t A services.myapp.com and I got real 0m0.355s user 0m0.010s sys 0m0.020s What should I check now to get my system as fast as it was before?

    Read the article

  • Parallel Port Problem in 12.04

    - by Frank Oberle
    I have a “dumb” printer attached to a parallel port in my machine which works fine under the “other” resident operating system (from Redmond) on the same machine. I recently added Ubuntu 12.04 as a dual boot on the machine, but Ubuntu doesn't seem to recognize the parallel port at all. All I need to set up a printer is a really plain-vanilla fixed pitch text-only generic driver, which is present, but no parallel ports show up. (The other printers, all on USB ports, seem to work just fine). Following what appeared to me to be the most reasonable of the many conflicting pieces of advice on the web, here's what I did: I added the following lines to /etc/modules parport_pc ppdev parport Then, after rebooting, I checked to see that the lines were still present, and they were. I ran dmesg | grep par and got the following references in the output that seemed like they might have to do with the parallel port: [ 14.169511] parport_pc 0000:03:07.0: PCI INT A -> GSI 21 (level, low) -> IRQ 21 [ 14.169516] PCI parallel port detected: 9710:9805, I/O at 0xce00(0xcd00), IRQ 21 [ 14.169577] parport0: PC-style at 0xce00 (0xcd00), irq 21, using FIFO [PCSPP,TRISTATE,COMPAT,ECP] [ 14.354254] lp0: using parport0 (interrupt-driven). [ 14.571358] ppdev: user-space parallel port driver [ 16.588304] type=1400 audit(1347226670.386:5): apparmor="STATUS" operation="profile_load" name="/usr/lib/cups/backend/cups-pdf" pid=964 comm="apparmor_parser" [ 16.588756] type=1400 audit(1347226670.386:6): apparmor="STATUS" operation="profile_load" name="/usr/sbin/cupsd" pid=964 comm="apparmor_parser" [ 16.673679] type=1400 audit(1347226670.470:7): apparmor="STATUS" operation="profile_load" name="/usr/lib/lightdm/lightdm/lightdm-guest-session-wrapper" pid=1010 comm="apparmor_parser" [ 16.675252] type=1400 audit(1347226670.470:8): apparmor="STATUS" operation="profile_load" name="/usr/lib/telepathy/mission-control-5" pid=1014 comm="apparmor_parser" [ 16.675716] type=1400 audit(1347226670.470:9): apparmor="STATUS" operation="profile_load" name="/usr/lib/telepathy/telepathy-*" pid=1014 comm="apparmor_parser" [ 16.676636] type=1400 audit(1347226670.474:10): apparmor="STATUS" operation="profile_replace" name="/usr/lib/cups/backend/cups-pdf" pid=1015 comm="apparmor_parser" [ 16.677124] type=1400 audit(1347226670.474:11): apparmor="STATUS" operation="profile_replace" name="/usr/sbin/cupsd" pid=1015 comm="apparmor_parser" [ 1545.725328] parport0: ppdev0 forgot to release port I have no idea what any of that means, but the line “parport0: ppdev0 forgot to release port ” seems unusual. I was still unable to add a printer for my old clunker, so I tried the direct approach, typing echo “Hello” > /dev/lp0 and received a Permission denied message. I then tried echo “Hello” > /dev/parport0 which didn't give me any message at all, but still didn't print anything. Running the command sudo /usr/lib/cups/backend/parallel gives the following: direct parallel:/dev/lp0 "unknown" "LPT #1" "" "" Checking the permissions for /dev/parport0, Owner, Group, and Other are all set to read and write. crw-rw---- 1 root lp 6, 0 Sep 9 16:37 /dev/lp0 crw-rw-rw- 1 root lp 99, 0 Sep 9 16:37 /dev/parport0 The output of the command lpinfo -v includes the following line: direct parallel:/dev/lp0 I've read several web postings that seem to suggest this has been a problem for several years, but the bug reports were closed because there wasn't enough information to address the issue (shades of Microsoft!). Any suggestions as to what I might be missing here?

    Read the article

  • How to setup passwordless SSH access for root user

    - by Cerin
    I need to configure a machine so software installation can be automated remotely via SSH. Following the wiki, I was able to set up SSH keys so my user can access the machine without a password, but I still need to manually enter my password when I use sudo, which an automated process obviously shouldn't have to do. Although my /etc/ssh/sshd_config has PermitRootLogin yes, I can't seem to be able to log in as root, presumably because it's not a "real" account with a separate password. How do I configure SSH keys so a process can remotely log in as root on Ubuntu?
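    A minimal sketch of the two usual routes (the paths are standard; "youruser" is a placeholder for the real account name): either give root its own authorized key, or let the unprivileged account run sudo without a password:
      # Option 1: copy the existing public key into root's authorized_keys (run on the target machine)
      sudo mkdir -p /root/.ssh && sudo chmod 700 /root/.ssh
      sudo sh -c 'cat /home/youruser/.ssh/authorized_keys >> /root/.ssh/authorized_keys'
      # Option 2: keep logging in as youruser, but allow passwordless sudo for automation
      echo 'youruser ALL=(ALL) NOPASSWD:ALL' | sudo tee /etc/sudoers.d/youruser
      sudo chmod 440 /etc/sudoers.d/youruser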

    Read the article

  • Is it safe to convert Windows file paths to Unix file paths with a simple replace?

    - by MxyL
    So for example say I had it so that all of my files will be transferred from a Windows machine to a Unix machine as such: C:\test\myFile.txt to {somewhere}/test/myFile.txt (the drive letter is irrelevant at this point). Currently, our utility library that we wrote ourselves provides a method that does a simple replace of all backslashes with forward slashes: public String normalizePath(String path) { return path.replaceAll("\\\\", "/"); } (note that replaceAll takes a regex, so a single literal backslash has to be written as "\\\\" in Java source). Slashes are reserved and cannot be part of a file name, so the directory structure should be preserved. However, I'm not sure if there are other complications between Windows and Unix paths that I may need to worry about (e.g., non-ASCII names, etc.)

    Read the article

< Previous Page | 163 164 165 166 167 168 169 170 171 172 173 174  | Next Page >