Search Results

Search found 18333 results on 734 pages for 'temporary directory'.

  • How can I copy the output from a remote command into the local clipboard?

    - by cwd
    I use iTerm2 as my terminal client in Mac OS X. On the local system I can use pbcopy and pbpaste to transfer data between the system clipboard and the terminal, but of course this doesn't work when you're ssh'ed into another machine. Is there some way I can take the result of a command and copy it to the clipboard automatically? Perhaps an AppleScript to grab the text in the iTerm window and then get the next-to-last line? For instance, if I wanted to copy the current working directory: I run pwd, then use the mouse to select the text, and then press Command+C. Is there any better / faster / automatic way of doing this? I'm not looking for a bulletproof solution that would work for every command (e.g. it might not work when there is a huge scrollback) - I'm just looking for something to make this task that I do quite often a little less tedious. Update: I'm looking into using screen to do this, but I'm still not sure if it is possible.
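
    One minimal sketch of a hands-off approach, assuming the remote host is reachable as user@remote (hypothetical name): run the command over ssh non-interactively and pipe its output straight into pbcopy on the local side, so nothing ever needs to be selected with the mouse.

        # Run the command remotely, copy its output locally:
        ssh user@remote 'pwd' | pbcopy

        # Reusable helper, e.g.: rcopy 'ls -la /tmp'
        rcopy() { ssh user@remote "$@" | pbcopy; }

    Because the output is captured from the pipe rather than scraped from the terminal window, scrollback size is irrelevant.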

  • Folder default ACLs not inherited when new file is created

    - by Flavien
    I'm a bit of a beginner with Unix systems, but I'm running Cygwin on my Windows Server, and I am trying to figure out something related to extended ACLs. I have a directory to which I set the following ACLs:

        Administrator@MyServer ~ $ setfacl -m d:u:Someuser:r-- somedir
        Administrator@MyServer ~ $ getfacl somedir/
        # file: somedir/
        # owner: Administrator
        # group: None
        user::rwx
        group::r-x
        mask:rwx
        other:r-x
        default:user::rwx
        default:user:Someuser:r--
        default:group::r-x
        default:mask:rwx
        default:other:r-x

    As you can see, most of the default ACLs have the x bit. Then, when I create a file in it, it doesn't inherit the ACLs it is supposed to:

        Administrator@MyServer ~ $ touch somedir/somefile
        Administrator@MyServer ~ $ getfacl somedir/somefile
        # file: somedir/somefile
        # owner: Administrator
        # group: None
        user::rw-
        user:Someuser:r--
        group::r--
        mask:rwx
        other:r--

    It's basically missing the x bit everywhere. Any idea why?
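
    For context, this is standard POSIX ACL behaviour rather than a Cygwin quirk: a new file's permissions are the intersection of the directory's default ACL and the mode the creating program requests, and touch requests 0666, so new files can never gain x from the default ACL. A small sketch illustrating the difference:

        # touch requests mode 0666 (rw-rw-rw-), so the default ACL cannot add x:
        touch somedir/somefile
        getfacl somedir/somefile   # x absent everywhere, as observed

        # mkdir requests 0777, so subdirectories do inherit the x bits:
        mkdir somedir/subdir
        getfacl somedir/subdir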

  • Sound card not detected in 13.04

    - by Ganessh Kumar R P
    I have a problem with my sound card. I don't have a volume up or down option anywhere, and in Settings -> Sound no card is detected. But when I run the command sudo aplay -l, I get the following output:

        **** List of PLAYBACK Hardware Devices ****
        Failed to create secure directory (/home/ganessh/.config/pulse): Permission denied
        card 0: MID [HDA Intel MID], device 0: STAC92xx Analog [STAC92xx Analog]
          Subdevices: 0/1
          Subdevice #0: subdevice #0
        card 1: NVidia [HDA NVidia], device 3: HDMI 0 [HDMI 0]
          Subdevices: 1/1
          Subdevice #0: subdevice #0
        card 1: NVidia [HDA NVidia], device 7: HDMI 0 [HDMI 0]
          Subdevices: 1/1
          Subdevice #0: subdevice #0
        card 1: NVidia [HDA NVidia], device 8: HDMI 0 [HDMI 0]
          Subdevices: 1/1
          Subdevice #0: subdevice #0
        card 1: NVidia [HDA NVidia], device 9: HDMI 0 [HDMI 0]
          Subdevices: 1/1
          Subdevice #0: subdevice #0

    And the command lspci -v | grep -A7 -i "audio" outputs:

        00:1b.0 Audio device: Intel Corporation 5 Series/3400 Series Chipset High Definition Audio (rev 06)
                Subsystem: Dell Device 02a2
                Flags: bus master, fast devsel, latency 0, IRQ 48
                Memory at f0f20000 (64-bit, non-prefetchable) [size=16K]
                Capabilities: <access denied>
                Kernel driver in use: snd_hda_intel
        00:1c.0 PCI bridge: Intel Corporation 5 Series/3400 Series Chipset PCI Express Root Port 1 (rev 06) (prog-if 00 [Normal decode])
        --
        02:00.1 Audio device: NVIDIA Corporation GF106 High Definition Audio Controller (rev a1)
                Subsystem: Dell Device 02a2
                Flags: bus master, fast devsel, latency 0, IRQ 17
                Memory at d3efc000 (32-bit, non-prefetchable) [size=16K]
                Capabilities: <access denied>
                Kernel driver in use: snd_hda_intel
        07:00.0 Network controller: Intel Corporation Ultimate N WiFi Link 5300

    So I assume that the drivers are properly installed, but I still don't get any option in the settings or volume control. The same card used to work well back in the 2010 versions (10.04 and 10.10). Any help is appreciated. Thanks
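
    A hedged observation: the "Failed to create secure directory (/home/ganessh/.config/pulse): Permission denied" line in the aplay output suggests that directory is owned by root (a common leftover from running GUI apps with sudo), which stops the per-user PulseAudio daemon from starting - and without PulseAudio, the Sound settings show no card. A sketch of the check and fix:

        # Is ~/.config/pulse (or ~/.config itself) root-owned?
        ls -ld ~/.config ~/.config/pulse

        # If so, give it back to the login user and restart PulseAudio:
        sudo chown -R "$USER":"$USER" ~/.config
        pulseaudio -k        # kill any half-started daemon
        pulseaudio --start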

  • Where are wireless profiles stored in Ubuntu?

    - by LonnieBest
    Where does Ubuntu store the profiles that allow it to remember the credentials of private wireless networks that it has previously authenticated to and used? I just replaced my uncle's hard drive with a new one and installed Ubuntu 10.04 on it (he had Ubuntu 9.10 on his old hard drive). He is at my house right now, and I want him to be able to access his private wireless network when he gets home. Usually, when I upgrade Ubuntu, I have his /home directory on another partition, so his wireless profile for his own network persists. However, right now I'm trying to figure out which dot-folder I need to copy from his /home/user folder on the old hard drive to the new hard drive, so that he will have wireless Internet when he gets home. Does anyone know with certainty exactly which folder I need to copy to the new hard drive to achieve this?
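
    To the best of my knowledge for the NetworkManager 0.8 era (9.10/10.04) - treat the exact paths as assumptions to verify: per-user connection profiles live in GConf and their passphrases in the GNOME keyring, while system-wide connections sit outside /home entirely. A sketch of what to copy, with the old drive mounted at /mnt/old (hypothetical mount point):

        # Per-user profiles (GConf) and their Wi-Fi secrets (keyring):
        cp -a /mnt/old/home/user/.gconf/system/networking/connections \
              /home/user/.gconf/system/networking/
        cp -a /mnt/old/home/user/.gnome2/keyrings /home/user/.gnome2/

        # System-wide profiles, if he used "available to all users":
        ls /mnt/old/etc/NetworkManager/system-connections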

  • Seeking a solution to automatically copy files from a CD-ROM to the USB drive once it's connected

    - by Ray Nathan
    I plan to distribute a free CD that automatically copies files to a connected USB device. This process will be done on the computers of the users that obtain the CD. The CD will contain an autorun.inf file that will instruct the computer to copy a set of files located on the CD to a specific directory on the connected USB device. The USB drive letter is not the same on all systems, so Windows XP should automatically determine the drive letter of the USB device before the copy operation begins. What would be the best way of creating a short batch file or script that I can place on the CD to execute this process? Also, please note that it is NOT feasible or recommended to include a batch file on the USB devices to sync this operation, due to the explanation at the beginning of this paragraph. :) Thank You All

  • How do I restore the default applets to Gnome's notification area?

    - by gbacon
    I have a fresh install of Karmic Koala. In a botched attempt at changing my default window manager, I somehow removed at least three applets from the notification area: network manager (nm-applet), volume control (gnome-volume-control-applet), and the battery meter (???). Now if I log out and back in, these applets don't run, but I can start them from the command line. Because it's a fresh install, I completely removed my user account and home directory. After recreating my account, I was frustrated to find that the applets are still missing, with no obvious way to add them back. How can I restore the default configuration?
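
    A sketch of two things worth checking, hedged since the exact cause isn't known: those three icons are autostarted programs rather than panel applets, so their .desktop entries have to be enabled; and if the panel layout itself was damaged, GNOME 2 keeps it in GConf under /apps/panel, where it can be reset to the defaults.

        # Are the autostart entries present, and not overridden per-user?
        ls /etc/xdg/autostart | grep -E 'nm-applet|volume|power'
        ls ~/.config/autostart   # delete any override with X-GNOME-Autostart-enabled=false

        # Reset the panel (including its notification area) to the stock layout:
        gconftool-2 --recursive-unset /apps/panel
        killall gnome-panel      # it respawns with the default configuration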

  • Looking for a central image database and tagging system for a group of users

    - by jstarek
    I'm doing IT support for a small volunteer organisation which needs to centrally store and organize around 2500 photos. Can anyone here recommend a database or similar system which matches the following criteria:

    - Intuitive to use for users with little computer experience
    - Multi-user support, ideally with integration into our existing LDAP user directory
    - Should have a web interface
    - Not a hosted solution like Picasa (because we have a rather slow internet connection with very slow upload)
    - Should allow tagging of images, sorting by various criteria and storing copyright information

    If there are native GNOME and/or Windows clients for the tool, that would be a great benefit. Many thanks in advance!

  • Insert Hyperlink via VBA

    - by Martin
    I have a Word VBA macro that loops through a directory and writes the file paths of files matching some criteria into a new Word document. It works well as plain text (as part of a loop):

        wdDocResults.Content.InsertAfter objFile.Path & Chr(13)

    However, I'd like them to be hyperlinks. The following works as a single macro, but when called from within another script it does nothing at all (no matter whether the path is provided as a variable or a string, or as H:... or \\MyServernameAsNetDrive...):

        ActiveDocument.Hyperlinks.Add Anchor:=Selection.Range, Address:=objFile.Path, _
            SubAddress:="", ScreenTip:="", TextToDisplay:=objFile.Path

    If I try to select the current line, in order to make sure something is selected at the right place, I get "error: out of memory":

        wdDocResults.Content.InsertAfter objFile.Path
        Selection.Expand wdLine
        ActiveDocument.Hyperlinks.Add Anchor:=Selection.Range, Address:=objFile.Path, _
            SubAddress:="", ScreenTip:="", TextToDisplay:=objFile.Path

    I also tried inserting a string resembling the hyperlink field code ({ HYPERLINK "..." }), which is of course not recognized... Any help is appreciated... Thanks in advance!

  • How do I unmount a charging device in Linux?

    - by PeanutsMonkey
    I have a mobile device attached via USB to a Linux box and wish to unmount it. I ran the command fdisk -l, however it does not list a mount point. I then ran the command lsusb, which yielded the screenshot below. I then searched the /dev/disk/by-id directory and found the following file. The file is a symbolic link to what appears to be /dev/sdc. Questions: Why does it not appear when I run the command fdisk -l? How do I unmount it properly, without simply yanking the USB cord from the USB port?
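
    Two hedged notes: fdisk -l never shows mount points (it prints partition tables), so the device may simply lack a recognised partition table while in charge-only mode; and "unmounting" only applies if the kernel actually mounted something. A sketch:

        # Did anything from the device get mounted?
        grep sdc /proc/mounts

        # If so, unmount that mount point first:
        sudo umount /dev/sdc1          # use whatever grep reported

        # Then detach the device cleanly (udisks v1 syntax; on udisks2
        # systems the equivalent is: udisksctl power-off -b /dev/sdc):
        udisks --detach /dev/sdc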

  • How to properly install Chromium from a zip and make it the default browser

    - by ClarifyLinux
    Since the Chromium PPA is no longer maintained, those of us preferring Chromium over Chrome have two options: build and install from source, or download either 'beta' or daily builds (in a zip file). Unfortunately for me, option 1 is overly complicated: I know how to compile most other applications on Ubuntu, but I've never been able to get Chromium to build correctly. So I am currently using option 2. In Chromium I have the Chromium Updater installed (http://goo.gl/ffAMy), which gives me quick access to the most recent 64-bit versions. Once downloaded, I install to /home/myuser/opt/chrome-linux. From this directory I can run the chrome binary, and it works perfectly except for the fact that I cannot get it to act as my default browser. I've tried, as root, installing the binary in /opt/chrome-linux/ with a symbolic link to the 'chrome' binary in /usr/bin. Unfortunately, this doesn't work as a non-root user. So my question is: how do I properly install a downloaded Chromium zip build so that it's listed as an option for the default browser?
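
    A sketch of one way to register an unpacked build, assuming the binary path given above (/home/myuser/opt/chrome-linux/chrome): the "default browser" choosers read the alternatives system and .desktop entries, so a zip build has to be added to both by hand.

        # Register with the alternatives system:
        sudo update-alternatives --install /usr/bin/x-www-browser x-www-browser \
            /home/myuser/opt/chrome-linux/chrome 200
        sudo update-alternatives --set x-www-browser /home/myuser/opt/chrome-linux/chrome

        # Give the desktop a .desktop entry, then make it the default:
        cat > ~/.local/share/applications/chromium-zip.desktop <<'EOF'
        [Desktop Entry]
        Type=Application
        Name=Chromium (zip build)
        Exec=/home/myuser/opt/chrome-linux/chrome %U
        Categories=Network;WebBrowser;
        EOF
        xdg-settings set default-web-browser chromium-zip.desktop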

  • How do I perform commands in another folder, without repeating the folder path?

    - by Valter Henrique
    Is there a clever way to do copy and move operations, or a command to duplicate a file, without having to cd into the folder and run mv there? For example, I have to run the following:

        mv /folder1/folder2/folder3/file.txt /folder1/folder2/folder3/file-2013.txt

    Note that the directory I'm moving the file to is the same, but I have to type the whole path again, and sometimes that gets annoying. I'm curious to know if there's another way to do this without repeating the whole path, since the operation happens within the same directory.
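
    In any shell with brace expansion (bash, zsh), the shared prefix only needs to be typed once; the braces expand to both full paths before mv runs:

        # Expands to exactly the mv command above:
        mv /folder1/folder2/folder3/file{,-2013}.txt

        # Same trick for copying:
        cp /folder1/folder2/folder3/file{.txt,-backup.txt}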

  • LAMP setup - phpMyAdmin says the mysqli extension is missing (but it's listed in phpinfo)

    - by WebweaverD
    I regularly set up VirtualBox Ubuntu machines to run as local webservers. I have set these up several times and never had an issue. Recently I have been cloning them, but this time I wanted to do a fresh install, in the hope of fixing some niggling problems which have propagated through my setups. However, something has changed:

    1) VirtualBox guest additions no longer allow me to copy and paste (I'll worry about that later).
    2) More importantly, phpMyAdmin no longer works as installed. Initially, going to localhost/phpmyadmin gave a message that the page could not be found, so I followed some instructions (sorry, I know it's vague - I can't find them now) which created a phpmyadmin directory in /var/www. But now I get an error saying: the mysqli extension is missing. If I run phpinfo(), mysql and mysqli are both listed.

    All I have done so far is install apache2 (working), install php5 (which I think used to come with Apache), install mysql server (and client for good measure), and install phpmyadmin. I found a post with a similar question which suggested I should install php5-mysql (done) and edit php.ini to uncomment the line extension=mysqli.so - that line is not there, so I tried adding it, with no joy. I have restarted Apache and still no joy on phpMyAdmin. Any help is much appreciated, as this is driving me nuts. Why the change for the worse? I was just starting to like Linux! I'm running a Windows 7 machine and the guest OS is Ubuntu 12.04 - I ran apt-get update before doing anything, so all packages should be the latest versions.
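
    A plausible cause to rule out, offered as an assumption: on Ubuntu 12.04 the CLI and Apache's mod_php read different php.ini files (/etc/php5/cli/ vs /etc/php5/apache2/), so mysqli can appear in one phpinfo and be missing from the other. A sketch of the check, plus the packaged fix that avoids hand-editing php.ini:

        # Which configuration does each PHP use?
        php -i | grep 'Loaded Configuration File'   # CLI
        # (for Apache, check a phpinfo() page in the browser instead)

        # The packaged extension registers itself under conf.d:
        sudo apt-get install php5-mysql
        ls /etc/php5/conf.d/ | grep -i mysql        # mysqli.ini should be listed
        sudo service apache2 restart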

  • Problem with MySQL installation: template configuration file cannot be found

    - by user35389
    Trying to install MySQL onto a Windows XP machine. While going through the installation steps (in the "MySQL Server Instance Config Wizard"), I get to a point where the window reads:

        MySQL Server Instance Configuration
        Choose the configuration for the server instance.
        Ready to execute...
        o Prepare configuration
        o Write configuration file
        o Start service
        o Apply security settings   (this line is greyed out)
        Please press [Execute] to start the configuration.
        [ Back ] [ Execute ] [ Cancel ]

    So I press Execute, and then a red X appears on the second step, "Write configuration file". At the bottom, where it originally said "Please press [Execute] to start the configuration", it now says:

        The template configuration file cannot be found at
        C:\Program Files\MySQL\MySQL Server 5.0\bin\my-template.cnf

    I'm unsure what this means, but I cancelled the config wizard and looked in the directory that had been created (C:\Program Files\MySQL\MySQL Server 5.0). There are some configuration settings files, and there are 4 folders: bin, data, Docs, share.

  • Setup CentOS centralized AUDIT and RSYSLOG server

    - by Warron.French
    Attempting to use these links - "Sending audit logs to SYSLOG server" and http://wiki.rsyslog.com/index.php/Centralizing_the_audit_log - I have been unable to get centralized AUDIT logging to work in my all-CentOS network environment. I have 6 workstations, dt1...dt6, and the log files are not generated at all; I cannot even tell whether the messages are being sent from the workstations dt1...dt6 over to the server (srv1). I have configured rsyslog.conf on the workstations as shown in the first link, and added the additional touches for generating the log files in a separate directory per YEAR/MONTH/DAY (using proper syntax) and in separate HOSTNAME-based_audit.log files. Note: RSYSLOG messaging does appear to work from the workstations over to the server, but the audit logging portion is not working. I am running CentOS 6.5 with RPMs audit-2.2-4.el6_5.x86_64, audit-libs-2.2-4.el6_5.x86_64, and rsyslog-5.8.10-8.el6.x86_64. I have gotten zero responses from wiki.rsyslog.com and really need this to work. If needed I can send files from one of my workstations and the server to aid in the process. Thanks, Warron
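
    One hedged thing to verify on each workstation: auditd does not emit through rsyslog on its own; on CentOS 6 the usual bridge is the audispd syslog plugin, which re-publishes audit events as syslog messages that the local rsyslog can then forward. A sketch, with the facility choice (local6) being an arbitrary assumption:

        # /etc/audisp/plugins.d/syslog.conf -- turn the bridge on:
        #   active = yes
        #   args = LOG_INFO LOG_LOCAL6
        sudo sed -i 's/^active *= *no/active = yes/' /etc/audisp/plugins.d/syslog.conf
        sudo service auditd restart

        # Forward that facility from each workstation to srv1:
        echo 'local6.* @@srv1:514' | sudo tee -a /etc/rsyslog.conf
        sudo service rsyslog restart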

  • Is Rsync like subversion, but for a server?

    - by johnlai2004
    I'm trying to learn how to use rsync, and I want to create daily backups of my production server. Right now I run the command:

        rsync -azr /var/www/* [email protected]:/var/www

    Now let's say that one day I want to roll back the /var/www/ directory on my production server to last month's version. How do I tell rsync to retrieve version N? On reading that rsync only copies differences between src and dest, I assumed rsync works like Subversion, where you commit changes to a destination and it keeps track of every version, with the option to check out any version at any time. Is that the way rsync works - like Subversion, but for an entire server? That would be great, because then it means I don't have to do full ssh copies for my nightly backups.
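
    For the record, rsync itself keeps no history: each run just makes the destination match the source. The common way to get point-in-time versions with it is --link-dest, which hard-links files that didn't change against the previous snapshot, so each dated snapshot costs only the changed files. A sketch, with the host name and backup paths hypothetical:

        # Nightly snapshot, hard-linked against yesterday's:
        today=$(date +%F); yest=$(date -d yesterday +%F)
        rsync -az --delete \
            --link-dest="/backups/www-$yest" \
            /var/www/ "user@backuphost:/backups/www-$today/"

        # Rolling back to a given day is then an ordinary copy from that directory.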

  • How do I share my iPhoto photos with my ubuntu partition?

    - by Taryn East
    I have a MacBook Pro dual-booting Snow Leopard and Ubuntu Karmic. I have recently imported hundreds of my photos into iPhoto, but I now want to be able to see them (and use them as desktop/screensaver images) from my Ubuntu partition (i.e. when the machine is running Ubuntu instead of Mac OS). Is there an easy way to do this directly from the iPhoto library, or do I have to move them all out to an external file directory or something? Further edit, just to make it clear: I have already uploaded my photos directly into iPhoto, then spent many days categorising, tagging and uploading to Flickr. Unless there's something I'm missing, I'm guessing it's likely too late for the "don't copy into the iPhoto library" option. Happy to be proven wrong :) Perhaps somebody knows of a way to "export" the library without losing any of the current information, so that I can from then on keep the photos in an external library? I don't want to do this, though, if I lose the information that is currently there.
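
    One hedged approach that leaves the library untouched: mount the Mac's HFS+ partition read-only from Ubuntu and point the wallpaper/screensaver at the originals folder inside the iPhoto library package (called Originals in iPhoto '09, Masters in later versions). The device name and username below are assumptions:

        # Mount the OS X partition read-only:
        sudo mkdir -p /mnt/osx
        sudo mount -t hfsplus -o ro /dev/sda2 /mnt/osx

        # The actual image files live inside the library package:
        ls "/mnt/osx/Users/taryn/Pictures/iPhoto Library/Originals"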

  • Package libxul not found - Kiwix Wikipedia in Ubuntu Precise 12.04

    - by JHOSmAN
    I'm trying to install the Kiwix service, but it needs a library that is not available for Ubuntu 12.04 LTS Precise. I'm leaving the log below; if someone could tell me how to install it, I would appreciate it.

        kiwix-0.9# ls
        aclocal.m4 COMPILE config.sub COPYING install-sh ltmain.sh missing static
        AUTHORS config.guess configure depcomp kiwix Makefile.am README
        CHANGELOG config.log configure.ac desktop
        libxul-dev_1.8.1.16+nobinonly-0ubuntu1_all.deb Makefile.in src

        root@ubuntu-MM061:/home/ubuntu/Escritorio/kiwix-0.9# ./configure
        checking for a BSD-compatible install... /usr/bin/install -c
        checking whether build environment is sane... yes
        [... a long run of standard toolchain checks, all completing, omitted ...]
        Package libxul was not found in the pkg-config search path.
        Perhaps you should add the directory containing 'libxul.pc'
        to the PKG_CONFIG_PATH environment variable
        No package 'libxul' found
        Package libxul was not found in the pkg-config search path.
        Perhaps you should add the directory containing 'libxul.pc'
        to the PKG_CONFIG_PATH environment variable
        No package 'libxul' found
        checking for /stable... no
        checking for "/nsISupports.idl"... no
        configure: error: unable to find nsISupports.idl

        # apt-get install libxul
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        E: Unable to locate package libxul

  • Connecting to localhost with smartphone: possible?

    - by Arturas Molcanovas
    I am currently developing a mobile PHP project on localhost (or, to be more precise, a locally hosted website reachable at http://azgoth/) on my desktop computer, and wish to check how the design looks in my phone's various browsers. However, since the project is actually on my computer rather than on the public net, I am unable to do so. Both my desktop computer and smartphone connect to the internet via the same router, so I wondered: would it be possible to somehow make http://azgoth/ accessible to my smartphone, in a similar way to how my desktop computer accesses it, without making it public for everyone to see?

    OS: Windows XP SP3
    Service: Apache HTTPD
    Router: D-Link DIR-300
    FTP Directory
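
    A sketch of the usual LAN approach, with the address and paths hypothetical: the phone can reach the desktop's private IP directly, so Apache only needs to answer on that address; nothing is exposed past the router as long as no port-forwarding rule is added.

        # httpd.conf sketch -- assumes the desktop's LAN address is 192.168.0.10
        Listen 80
        <VirtualHost *:80>
            ServerName azgoth
            ServerAlias 192.168.0.10
            DocumentRoot "C:/www/azgoth"   # hypothetical path
        </VirtualHost>
        # Then browse to http://192.168.0.10/ from the phone (find the real
        # address with ipconfig, and allow Apache through the XP firewall).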

  • rsync doesn't use delta transfer on first run

    - by ockzon
    I'm trying to synchronize a large local directory (with a batch file using rsync 3.0.7 on Cygwin, Windows 7 x64; 30k files, 200 GB) to a remote server (Debian x64 with kernel 2.6, rsyncd 3.0.7) over a slow internet connection (90 kbyte/s upload). I know almost all the files are identical, and I verified that using md5sum locally and remotely. However, when executing rsync from my local machine, every file gets transferred completely the first time. When I terminate the batch file after a few transfers and run it again, the already-transferred files are skipped. But as soon as it gets to a file not yet transferred, it uploads the file as a whole again, instead of noticing that the checksum is the same locally and remotely. The batch file calling rsync looks like this (backslashes and line breaks added here for readability):

        c:\cygwin\bin\rsync.exe --verbose --human-readable --progress --stats \
            --recursive --ignore-times --password-file pwd.txt \
            /cygdrive/d/ftp/data/ \
            rsync://[email protected]:33400/data/ | \
            c:\cygwin\bin\tee.exe --append rsync.log

    I experimented with the following parameters in varying combinations, but that didn't help either: --checksum, --partial, --partial-dir=/tmp/.rsync-partial, --compress
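
    A hedged note on why this is expected: the delta algorithm only applies to files that already exist on the receiver - a file absent from the destination has nothing to diff against, so it is always sent whole on its first transfer. On top of that, --ignore-times defeats the quick size+mtime check that would otherwise skip identical, already-present files. A sketch of an invocation closer to the intent (module as in the question, username hypothetical):

        rsync.exe --verbose --human-readable --progress --stats \
            --recursive --checksum --partial --partial-dir=.rsync-partial \
            --password-file pwd.txt \
            /cygdrive/d/ftp/data/ rsync://user@host:33400/data/

        # --checksum skips files whose full checksums already match on both
        # ends, at the cost of reading every file; dropping --ignore-times
        # is the key change.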

  • Windows file server access control by device

    - by Ori Shavit
    I'm trying to build a system where access to certain resources (file shares) on a Windows server is limited not only by the username (in an Active Directory domain) but also by the client machine. So far, I haven't found a good way to do this; adding the computer account to the DACL is apparently not the way to do it. Windows Server 2012 supports this with Dynamic Access Control, but that method seems to require all clients to be Windows 8, with no way to use it with Windows 7 clients. Is there a supported way to do this (or, alternatively, to add support for device authorization with Windows 7)?

  • Automatically cycle numerous or large files to the trash

    - by minameismud
    I've been tasked with fixing a vendor's program that, under certain conditions, dumps gigs of junk files into a log directory, eventually filling users' machines. My task is to figure out how to make it stop, without any source code or additional running processes, and without making the program kasplode. In other words, I'm looking to use a feature of the file system to control the growth. One idea I had was to make a hard link from that folder to NUL, as you might with /dev/null in the Linux world. However, my attempts to use the mklink program to create a junction result in a message that says "Local volumes are required to complete the operation." Any ideas on how to complete the junction, or other ideas to solve the problem?

  • How to identify process that's sending error messages to terminal?

    - by kjo
    The following error message occasionally appears in my terminal:

        Failed to open VDPAU backend libvdpau_nvidia.so: cannot open shared object file: No such file or directory

    ...which is pretty annoying. I have searched online for solutions to this error, without success. Is there any way to at least identify the process responsible for sending these error messages to my terminal? EDIT: Let me clarify that, as far as I can tell, these error messages appear out of the blue. In fact, they appear asynchronously with respect to my interactions with the terminal (more often than not, I see them for the first time when I return to a terminal window that has been unattended for some time). I'm sure there's a definite, deterministic cause for these messages, but it is not one that I can readily identify; in short, I have not noticed any pattern or regularity to their occurrence. In particular, their occurrence has nothing to do with running mplayer or any other video playback program. (Please see my earlier post about it here.) For one thing, the machine in question is a work machine, and I rarely watch any videos on it. In the very few instances in which I've watched a video on this machine I've used VLC, not mplayer, and these errors never appeared on those rare occasions.
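
    A sketch of one way to narrow it down: whatever writes to the terminal must hold its pseudo-terminal open, either as its controlling tty or as an inherited file descriptor, so the pty's process list is the place to look.

        # Which pty is this terminal? (run inside it)
        tty                    # prints e.g. /dev/pts/3

        # Processes whose controlling terminal is that pty:
        ps -ft pts/3

        # Wider net: anything holding the pty open as a file descriptor:
        sudo lsof /dev/pts/3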

  • Exited Emacs without saving, did it save the file somewhere?

    - by rumtscho
    Hi, I am using Emacs on Windows XP. I had it open to take some notes in a meeting, but forgot to create a file first (the text was created in the usual scratch buffer). Then I closed a lot of applications after the meeting, forgetting to save the notes in Emacs :( I know that when I am modifying an existing file, Emacs creates a backup of the old file in the same directory. But I don't know if the information from the scratch buffer is available somewhere after an exit without saving. Do you know if I can restore my information? I haven't shut down/suspended/hibernated the system since, so all temp files should still be accessible.

  • Microsoft SQL Server 2008 R2 System Databases

    For a majority of software developers, little time is spent understanding the inner workings of the database management systems (DBMS) they use to store data for their applications. I personally place myself in this grouping. In my case, I have used various versions of Microsoft's SQL Server (2000, 2005, and 2008 R2), and just recently learned how valuable the system databases really are when I was preparing to deliver a lecture on "SQL Server 2008 R2, System Databases".

    So what are system databases in MS SQL Server, and why should I know them? Microsoft uses system databases to support the SQL Server DBMS, much like a developer uses config files or database tables to support an application. These system databases individually provide specific functionality that allows MS SQL Server to function.

        Name          Database File              Log File
        Master        master.mdf                 mastlog.ldf
        Resource      mssqlsystemresource.mdf    mssqlsystemresource.ldf
        Model         model.mdf                  modellog.ldf
        MSDB          msdbdata.mdf               msdblog.ldf
        Distribution  distmdl.mdf                distmdl.ldf
        TempDB        tempdb.mdf                 templog.ldf

    Master Database
    If you have used MS SQL Server, then you should recognize the Master database, especially if you have used SQL Server Management Studio (SSMS) to connect to a user-created database. MS SQL Server requires the Master database in order for the DBMS to start, due to the information that it stores. Examples of data stored in the Master database: user logins, linked servers, configuration information, and information on user databases.

    Resource Database
    Honestly, until recently I never knew this database even existed, until I started to research SQL Server system databases. The reason for this is largely that the Resource database is hidden from users. In fact, its database files are stored within the Binn folder instead of the standard MS SQL Server database folder path. This database contains all system objects that can be accessed by all other databases. In short, it contains all the system views and stored procedures that appear in every user database regarding system information. One of the many benefits of storing the system views and stored procedures in a single hidden database is that it simplifies upgrading a SQL Server instance; not to mention that maintenance is decreased, since only one code base has to be maintained for all of the system views and procedures.

    Model Database
    The Model database, as the name implies, is the model for all new databases created by users. This allows default database objects to be predefined for all new databases within a MS SQL Server instance. For example, if every database created by a user needs to have an "Audit" table, then defining the "Audit" table in the model guarantees that the table will be present in every new database created after the model is altered.

    MSDB Database
    The MSDB database is used by SQL Server Agent, SQL Server Database Mail, and SQL Server Service Broker, along with SQL Server itself. The SQL Server Agent uses this database to store job configurations and SQL job schedules, along with SQL alerts and operators. In addition, this database also stores all SQL job parameters, along with each job's execution history. Finally, it also stores database backup and maintenance plans, as well as details pertaining to SQL log shipping, if it is being used.

    Distribution Database
    The Distribution database is only used during replication, and stores metadata and history information pertaining to the act of replicating data. Furthermore, when transactional replication is used, this database also stores information regarding each transaction. It is important to note that replication is not turned on by default in MS SQL Server, and that the Distribution database is hidden from SSMS.

    Tempdb Database
    The Tempdb, as the name implies, is used to store temporary data and data objects; examples include temp tables and temp stored procedures. It is important to note that all data and data objects in this database are cleared when SQL Server restarts. This database is also used by SQL Server when it is performing some internal operations; typically, SQL Server uses it for large sort and index operations. Finally, this database is used to store row versions if row versioning or snapshot-isolation transactions are being used by SQL Server.

    Additionally, I would love to hear from others about their experiences using system databases, tables, and objects in real-world environments.
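
    A small T-SQL sketch of the Model-database behaviour described above (the table and database names are illustrative, not from the original article): any object created in model is copied into every database created afterwards.

        -- Add a default object to the model database:
        USE model;
        GO
        CREATE TABLE dbo.Audit (
            AuditID   int IDENTITY(1,1) PRIMARY KEY,
            ChangedAt datetime NOT NULL DEFAULT GETDATE(),
            ChangedBy sysname  NOT NULL DEFAULT SUSER_SNAME()
        );
        GO

        -- Every database created from now on inherits the table:
        CREATE DATABASE DemoDb;
        GO
        SELECT name FROM DemoDb.sys.tables;   -- returns 'Audit'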

  • Eclipse: Organising Files

    - by someguy
    I want to import a project that I'm planning to build upon. The problem is that it is very messy, with source files, class files and libraries all under one directory. How would I organise these files using Eclipse? I know you can change the source folder and output folder, but when I change the source folder, the files that I want inside it do not physically move to that folder. The output folder is fine, though. Also, I would like a separate folder for libraries; I'm not sure how I would go about this, however. Here's how I would like it:

    - src: this folder will contain source files.
    - bin: this folder will contain binary (class) files.
    - lib: this folder will contain external libraries.
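
    A hedged sketch of doing the physical move in a shell first (Eclipse's build-path settings only remap folders; they don't move files), assuming the loose files sit at the project root - packaged sources would instead be moved as whole directories:

        # From the project root:
        mkdir -p src bin lib
        find . -maxdepth 1 -name '*.java'  -exec mv {} src/ \;
        find . -maxdepth 1 -name '*.class' -exec mv {} bin/ \;
        find . -maxdepth 1 -name '*.jar'   -exec mv {} lib/ \;

    Then refresh the project in Eclipse (F5), set src as the source folder and bin as the output folder under Project > Properties > Java Build Path, and add the jars in lib via "Add JARs...".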
