Search Results

Search found 16602 results on 665 pages for 'directory'.

Page 365/665

  • ASP.Net Development Tips

    Opening ASP.NET 3.5 websites in VWD 2010: VWD Express 2010 supports ASP.NET 4.0 by default. If you are opening old projects based on either ASP.NET 3.5 or ASP.NET 2.0, you need to make some adjustments. Refer to the steps below: 1. Back up the folder containing your ASP.NET 3.5 website files and place it in another directory. For example, suppose this is the path of your original ASP.NET website that needs to be opened in VWD 2010: L:\aspdotnetprojects\areaofcirclefunction. Copy that folder (do not cut it) and put it in a separate folder that can be accessed by VWD 2010. By copying the fo...

    Read the article

  • Why would Remmina stop working?

    - by Chris Curvey
    Until sometime last night, I had Remmina working fine. I could run RDP through an SSH tunnel and all was well. Then it stopped working. I can get as far as the password dialog for my work machine, but then it just says "Cannot connect to RDP server localhost". I can't even find any logs that look interesting. I've reinstalled Remmina, cleared my .remmina directory, restarted my machine, and even restarted my gateway. Just to make it really weird, my laptop (which has the same setup -- latest Ubuntu and Remmina) can make the connection just fine. It even goes through the same router, albeit wirelessly. Any thoughts?
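    One way to narrow this down is to test the SSH tunnel and the RDP connection separately from a terminal. A minimal diagnostic sketch, assuming an OpenSSH client and netcat are installed; the host names are placeholders for the work machine and gateway:

      # forward local port 3389 to the work machine's RDP port via the gateway; -v prints tunnel errors
      ssh -v -N -L 3389:work-machine:3389 user@gateway

      # in a second terminal, confirm something is actually listening on the tunnel's local end
      nc -zv localhost 3389

    If the port check fails, the problem is in the tunnel rather than in Remmina's RDP layer.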

    Read the article

  • Is there a better approach in migrating SIT SVN to UAT SVN?

    - by huahsin68
    In web development, the same piece of source code is deployed to SIT (system integration testing) SVN/WAS and UAT (user acceptance testing) SVN/WAS. Please take note that I am using Jenkins to build everything. I have already ensured that the transition from SIT SVN to UAT SVN is in sync by doing a manual diff on the two directories. Usually I will ensure the SIT WAS is working fine and only then deploy to UAT WAS. But now a problem has shown up in UAT WAS while everything works fine in SIT WAS. I suspect a migration fault happened between SIT SVN and UAT SVN. In such a scenario, is there a better approach to handle this problem?
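    Rather than diffing two checked-out directories by hand, Subversion can compare the two repository trees directly. A sketch; the SIT and UAT URLs are placeholders for the actual repository layout:

      # list the files that differ between the two trees
      svn diff --summarize http://svnserver/sit/trunk@HEAD http://svnserver/uat/trunk@HEAD

      # show the full textual differences for anything suspicious
      svn diff http://svnserver/sit/trunk@HEAD http://svnserver/uat/trunk@HEAD

    A comparison like this can also run as a Jenkins job so the check happens before every UAT deployment rather than after a failure.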

    Read the article

  • Unable to mount internal disk: mount exited with exit code 13

    - by Masri
    My Ubuntu system runs into an error when I try to mount one of my internal disks; it gives this error message: Error mounting: mount exited with exit code 13: $MFTMirr does not match $MFT (record 3). Failed to mount '/dev/sda7': Input/output error NTFS is either inconsistent, or there is a hardware fault, or it's a SoftRAID/FakeRAID hardware. In the first case run chkdsk /f on Windows then reboot into Windows twice. The usage of the /f parameter is very important! If the device is a SoftRAID/FakeRAID then first activate it and mount a different device under the /dev/mapper/ directory, (e.g. /dev/mapper/nvidia_eahaabcc1). Please see the 'dmraid' documentation for more details. Please advise how to solve the above error. Many thanks to you in advance.
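    If booting into Windows to run chkdsk /f (the fix the message itself recommends) is not currently possible, ntfsfix from the ntfs-3g tools can often repair a $MFTMirr/$MFT mismatch. A sketch, assuming /dev/sda7 from the error message and that the partition is not mounted:

      sudo ntfsfix /dev/sda7        # repairs common NTFS inconsistencies, including the MFT mirror
      sudo mount /dev/sda7 /mnt     # then try mounting again

    If the disk itself is failing, chkdsk from Windows remains the safer route.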

    Read the article

  • Getting a Conexant CX23885 TV Capture Card working

    - by Benny
    I'm new to Linux, and am trying to get my capture card working on 11.04. The only command that I know to run to find out any information is lspci, which tells me that I have: 02:00.0 Multimedia video controller: Conexant Systems, Inc. CX23885 PCI Video and Audio Decoder (rev 04) I've looked at using Me TV, but haven't worked out how to configure it for my card, or what I need to do to get it running. I'm not fussed about what software I use to run the capture card, but I've currently got only Me TV installed. Edit: When I run tvtime, I get the following errors: videoinput: Cannot open capture device /dev/video0: No such file or directory mixer: find error: Success mixer: Can't open mixer default, mixer volume and mute unavailable. mixer: Can't open device default/Line, mixer volume and mute unavailable. Segmentation fault
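    The tvtime error says no /dev/video0 exists, so the first thing to check is whether the kernel driver for the CX23885 actually loaded. A diagnostic sketch (cx23885 is the in-kernel module for this chip; firmware requirements vary by card):

      lsmod | grep cx23885          # is the driver module loaded?
      dmesg | grep -i cx23885       # did it probe the card, or complain about missing firmware?
      ls -l /dev/video*             # did a capture device node appear?

    If dmesg reports missing firmware, installing the linux-firmware package (or the specific file it names) usually resolves that part.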

    Read the article

  • New vs Clone Git in Eclipse with EGit

    - by Matty F
    I'm not sure I have my new-repo workflow, my clone-repo workflow, or both, set up correctly. When I create a new project, I create a repo on GitHub; I can't clone from it as it's empty, so I create a new project, which goes into my workspace, and then git init runs on the workspace copy. So I end up with everything in workspace\project-name. However, when I clone from GitHub, first I need to clone the repo, and this goes into my default git directory (C:\git) as git\cloned-project-name. I then need to import this Git repo as a project into my workspace, and I end up with workspace\cloned-project-name, effectively duplicating the project folder in the git area. I've tried to clone to workspace\cloned-project-name, but then it asks to import the git project, and if I try to use workspace\cloned-project-name again, it errors. What am I doing wrong? Thanks, Matt.
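    For comparison, the plain-Git equivalent of the two workflows looks like this; the URL and paths are placeholders. The point is that a clone can go straight to wherever the working tree should live, so nothing needs to be copied afterwards:

      # new project: initialise in place and push to the empty GitHub repo
      cd ~/workspace/project-name
      git init
      git remote add origin git@github.com:user/project-name.git
      git add . && git commit -m "Initial commit"
      git push -u origin master

      # existing project: clone directly to the desired working-tree location
      git clone git@github.com:user/cloned-project-name.git ~/workspace/cloned-project-name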

    Read the article

  • Creating symlink for Postgres

    - by Edwin
    As a developer, I often ssh right into my local database, just to test my application before pushing my code. However, I find that every time I want to access Postgres, I have to type in postgres@ubuntu:~$ /usr/local/pgsql/bin/psql test whereas on my work machine, all I have to do is type postgres@ubuntu:~$ psql --dbname=test --username=user I tried creating a symlink, which was successful, but whenever I try connecting to it through this shortcut, I get the following error message: psql: could not connect to server: No such file or directory Is the server running locally and accepting connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"? How do I get this to work? In case it makes any difference, I'm using a self-compiled version of the 9.1.x series.
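    The symlink itself is probably fine; the error shows that the psql being run looks for the server's Unix socket in /var/run/postgresql, while a self-compiled server usually puts its socket in /tmp. A sketch, assuming the default build paths from the question:

      # put the self-compiled binaries first on PATH (add to ~/.bashrc to make it permanent)
      export PATH=/usr/local/pgsql/bin:$PATH

      # or point psql explicitly at the directory that holds the server's socket
      psql -h /tmp --dbname=test --username=user

    Alternatively, setting unix_socket_directory in postgresql.conf to /var/run/postgresql makes the self-compiled server match what packaged clients expect.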

    Read the article

  • Sound is not working correctly on Ubuntu 12.04

    - by Jeggy
    I know this is my own fault, but what I did was this: first I ran 'sudo apt-get remove pulseaudio' and then 'sudo apt-get install pulseaudio', and now the sound doesn't work properly. The indicator doesn't work either; it's just greyed out. The shortcuts are not working either. Alsamixer is working, and this is the only way I can change the volume at the moment: jeggy@jeggy-XPS:~$ cat /proc/asound/cards 0 [PCH ]: HDA-Intel - HDA Intel PCH HDA Intel PCH at 0xf1c00000 irq 52 jeggy@jeggy-XPS:~$ aplay -l **** List of PLAYBACK Hardware Devices **** ALSA lib conf.c:1686:(snd_config_load1) _toplevel_:11:0:Unexpected end of file ALSA lib conf.c:3406:(config_file_open) /etc/asound.conf may be old or corrupted: consider to remove or fix it /usr/bin/pulseaudio: error while loading shared libraries: libpulsecommon-1.1.so: cannot open shared object file: No such file or directory card 0: PCH [HDA Intel PCH], device 0: ALC665 Analog [ALC665 Analog] Subdevices: 0/1 Subdevice #0: subdevice #0 card 0: PCH [HDA Intel PCH], device 1: ALC665 Digital [ALC665 Digital] Subdevices: 1/1 Subdevice #0: subdevice #0 card 0: PCH [HDA Intel PCH], device 3: HDMI 0 [HDMI 0] Subdevices: 1/1 Subdevice #0: subdevice #0 VLC sound is not working, and I am getting this error:
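    The aplay output gives two concrete leads: /etc/asound.conf is corrupted, and the pulseaudio binary cannot find libpulsecommon-1.1.so, so the remove/reinstall cycle has left the library packages in a bad state. A recovery sketch; the package names assume 12.04's PulseAudio 1.1:

      sudo mv /etc/asound.conf /etc/asound.conf.bak        # move the corrupted ALSA config out of the way
      sudo apt-get install --reinstall pulseaudio libpulse0 pulseaudio-utils indicator-sound
      pulseaudio -k 2>/dev/null; pulseaudio --start        # restart the sound server for this session

    Logging out and back in afterwards lets the indicator reconnect to the restarted daemon.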

    Read the article

  • Subdomain on root domain

    - by dror
    I have a site, actually a "portal"/"directory" for service providers. For a start, we opened a page on our site for every service provider, but now we get a lot of requests from those providers who want sites of their own. We want to give every service provider his own site, but on a subdomain URL (they don't mind; it's OK for them). So, my site is www.example.com and their sites will be provider.example.com. Now I have two questions: can this harm my site's SEO? And if one of those subdomains is punished by Google because its owner does black-hat SEO, how will it affect the root domain? Can it cause the root domain to be punished?

    Read the article

  • Disable suspend / hibernate via PolicyKit

    - by redonath
    I am trying to run Lubuntu 12.04, but if the computer suspends I am unable to boot up again. Instead I see the BIOS POST, the hard disk light flickers once, and I have to install again (I have tried to reinstall GRUB 2). I am new to Linux, and what I found that best answered my question was posted by James Henstridge. The instructions say to create disable-shutdown.pkla in /etc/polkit-1/50-local.d/, but this directory does not exist, so do I create a folder titled 50-local.d in polkit-1, or do I have to place this file elsewhere?
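    On 12.04 the local-authority rules normally live one level deeper, under /etc/polkit-1/localauthority/50-local.d/, and it is fine to create that directory if it does not exist. A sketch of what the file could look like; the file name is just an example, and the action IDs shown are the upower suspend/hibernate actions used on 12.04:

      sudo mkdir -p /etc/polkit-1/localauthority/50-local.d
      sudo nano /etc/polkit-1/localauthority/50-local.d/disable-suspend.pkla

    with this content:

      [Disable suspend and hibernate]
      Identity=unix-user:*
      Action=org.freedesktop.upower.suspend;org.freedesktop.upower.hibernate
      ResultAny=no
      ResultInactive=no
      ResultActive=no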

    Read the article

  • Can't untar Testdisk 6.14 using live CD

    - by Orestes
    I'm using an Ubuntu live CD right now and need to recover files from a (Windows 7) partition. I've read about TestDisk and tried downloading it and untarring it, but: testdisk-6.14-WIP.linux26.tar.bz2: Cannot open: No such file or directory tar (child): Error is not recoverable: exiting now tar: Child returned status 2 tar: Error is not recoverable: exiting now I don't know why it doesn't work (noob). I tried using sudo apt-get install testdisk, but: Reading package lists... Done Building dependency tree Reading state information... Done E: Unable to locate package testdisk So... help :D
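    tar's "Cannot open: No such file or directory" usually just means the command was run from a different directory than the one the archive was saved to. A sketch, assuming the file landed in the live session's Downloads folder; the apt error typically means the package lists have not been refreshed (or the universe repository is not enabled) on the live CD:

      cd ~/Downloads                                # go to wherever the archive was saved
      ls testdisk-*                                 # confirm the file is actually there
      tar xjf testdisk-6.14-WIP.linux26.tar.bz2     # x = extract, j = bzip2, f = file

      # or try the packaged version after refreshing the package lists
      sudo apt-get update
      sudo apt-get install testdisk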

    Read the article

  • Games installed from repos won't start

    - by Shauna
    I'm running Natty (upgraded from Maverick) with Gnome 3 (from the Gnome 3 PPA, Unity removed), and have recently found that some of my games from the repos no longer work. When I go to start them, I get the message Failed to launch [app name]. Failed to execute child process [app name] (no such file or directory). I've so far found this on Gweled and PyScrabble. Other games (Mines, Sudoku, Mahjongg), as well as other applications have opened just fine. Gweled used to open fine until recently. Any ideas on how to fix this?
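    The "Failed to execute child process" message usually means the command in the launcher's Exec line can no longer be found on PATH. A quick way to check, using Gweled as the example (the .desktop path is the usual location for packaged games):

      which gweled                                          # is the binary still installed and on PATH?
      gweled                                                # run it from a terminal to see the real error
      grep '^Exec' /usr/share/applications/gweled.desktop   # what is the launcher actually trying to run?

    If which finds nothing, reinstalling the package (sudo apt-get install --reinstall gweled) should put the binary back.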

    Read the article

  • Cannot get debconf version after deleting /var/lib/dpkg

    - by pije76
    As a new Ubuntu user, I have just made a mistake: I've deleted the folder /var/lib/dpkg/ instead of /var/lib/dpkg/lock :) Now when I execute apt-get -f install it displays this error message: ... E: Cannot get debconf version. Is debconf installed? debconf: apt-extracttemplates failed: No such file or directory ... I've tried this tutorial: http://people.adams.edu/~cdmiller/posts/Ubuntu-dpkg-recovery/ but still no luck. How can I fix this issue?
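    The usual recovery is to recreate the directory skeleton dpkg expects and restore the package database from the automatic backups kept under /var/backups. A sketch, assuming those daily backups still exist:

      sudo mkdir -p /var/lib/dpkg/info /var/lib/dpkg/triggers /var/lib/dpkg/updates /var/lib/dpkg/alternatives /var/lib/dpkg/parts
      sudo touch /var/lib/dpkg/status /var/lib/dpkg/available
      sudo cp /var/backups/dpkg.status.0 /var/lib/dpkg/status   # most recent backup of the package database
      sudo apt-get update
      sudo apt-get check                                        # verify apt/dpkg are consistent again

    The per-package files under info/ are not backed up, so some packages may still need sudo apt-get install --reinstall afterwards.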

    Read the article

  • Ralink RT2860 not working anymore in Ubuntu 12.10

    - by greenit
    I just upgraded from Ubuntu 12.04 to 12.10; however, my network card is not working anymore... When I had 12.04 installed, I just had to download the driver from Ralink, go to the directory and enter these commands: $ make $ sudo make install and the network card worked... Now, when I do this in Ubuntu 12.10, it seems like it works, but it doesn't. I can enable the module, and it recognizes the network card, but I am unable to connect to any wireless network... How can I get my wireless card working again? If anyone knows a solution, please tell me; thanks in advance.
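    Recent kernels ship an in-tree driver, rt2800pci, for the RT2860, and a leftover hand-built rt2860sta module from 12.04 can conflict with it after the upgrade. A diagnostic sketch along those lines; the module names are the usual ones for this chipset:

      lsmod | grep -E 'rt2860|rt2800'                        # which driver is actually loaded?
      dmesg | grep -iE 'rt2860|rt2800'                       # any probe or firmware errors?
      sudo find /lib/modules/$(uname -r) -name 'rt2860sta*'  # is the old vendor module still installed?
      sudo modprobe -r rt2860sta 2>/dev/null                 # unload it if so
      sudo modprobe rt2800pci                                # and let the in-kernel driver take over

    rt2800pci also needs firmware from the linux-firmware package, which 12.10 installs by default.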

    Read the article

  • How do I copy an existing hard disc to a new one so I can boot off the new disc?

    - by Brian Hooper
    I currently have a failing hard drive which is the only hard drive in the machine. I have just bought a new hard drive to replace it, and my plan is to copy the contents of the old drive onto the new one, and then replace the old drive in the machine with the new one. I presumably can't just copy the whole directory structure (or can I)? What do I need to do to manage this, assuming it is possible? Is there a utility to do this for me? (The old drive is hopefully good for a few more hours.) (I hope by this means to keep all the software and configuration files as they are, to avoid having to re-install everything. Can that be done?)
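    Yes, this can be done, and because the source drive is failing, a tool that copies the whole disk block by block and tolerates read errors is safer than copying files. A sketch using GNU ddrescue; /dev/sda (old) and /dev/sdb (new) are placeholders, so confirm the device names with lsblk before running anything:

      sudo apt-get install gddrescue                      # the package is gddrescue, the command is ddrescue
      lsblk                                               # identify which device is the old drive and which is the new one
      sudo ddrescue -f -n /dev/sda /dev/sdb rescue.log    # first pass: copy everything, skipping bad areas
      sudo ddrescue -f /dev/sda /dev/sdb rescue.log       # second pass: go back and retry the skipped areas

    Run this from a live CD/USB so the old drive is not mounted while being copied; because the copy is byte for byte, the partitions, bootloader, software and configuration all come across unchanged.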

    Read the article

  • How to properly set up Sun's JDK?

    - by jurchiks
    I'm trying to manually install the Sun JDK package (I have my reasons, don't bother asking why). I've successfully extracted the .bin file into /usr/lib/jvm/jdk1.6.0_23, but the problem is the PATH variable. I added this line to the /etc/environment file: JAVA_HOME="/usr/lib/jvm/jdk1.6.0_23" and added JAVA_HOME/bin to the PATH variable, BUT the OS still doesn't recognise the command java, says it's not installed and offers me gcj and openjdk. There was another way by using java-package and converting the .bin to .deb installer, but unfortunately that package is not available on/for maverick, so I can't do it that way. How can I make the PATH variable work and is there anything else required apart from the environment variables to make it all work? When I try to use the update-java-alternatives -l command, it says the following: awk: cannot open /usr/lib/jvm/*.jinfo (No such file or directory) jdk1.6.0_23 /usr/lib/jvm/jdk1.6.0_23 What should be the name of the file and the contents of it?
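    One thing to note is that /etc/environment is not a shell script, so a PATH entry written as JAVA_HOME/bin is stored literally rather than expanded. Registering the hand-unpacked JDK with the alternatives system sidesteps PATH editing entirely; a sketch, assuming the extraction path from the question:

      sudo update-alternatives --install /usr/bin/java  java  /usr/lib/jvm/jdk1.6.0_23/bin/java  1
      sudo update-alternatives --install /usr/bin/javac javac /usr/lib/jvm/jdk1.6.0_23/bin/javac 1
      sudo update-alternatives --config java            # pick the Sun JDK from the menu
      java -version                                     # confirm which JVM now answers

    update-java-alternatives is a convenience wrapper that reads a .jinfo file shipped by packaged JDKs; for a manually unpacked JDK it is simpler to rely on update-alternatives directly, as above.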

    Read the article

  • Bash script to move files to folders based on name

    - by user289111
    I hope you can help me... I made a Perl and bash script to back up my firewalls and transfer the files via tftp: #!/bin/sh perl /deploy/scripts/backups/10.160.23.1.pl > /dev/null 2>&1 perl /deploy/scripts/backups/10.160.23.2.pl > /dev/null 2>&1 This transfers the files to my tftp directory, /tftpboot/: ls -l /tftpboot/ total 532 -rw-rw-rw- 1 tftp tftp 209977 jun 6 14:01 10.160.23.1_20140606.cfg -rw-rw-rw- 1 tftp tftp 329548 jun 6 14:02 10.160.23.2_20140606.cfg My question is how to improve my script so that it moves these files dynamically to another folder based on the name (in this case the IP address), for example: 10.160.23.1_20140606.cfg moves to /deploy/backups/10.160.23.1/ I'm sure the answer is somewhere on Google, but I wanted to know if there is a particular solution to this request, and also to learn how to do it :) Thanks!
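    A small loop appended to the existing script can do this: strip everything from the first underscore onward to get the IP, create the matching folder, and move the file. A sketch, assuming the filenames keep the <ip>_<date>.cfg pattern shown above:

      #!/bin/sh
      for f in /tftpboot/*.cfg; do
          [ -e "$f" ] || continue              # skip if the glob matched nothing
          name=$(basename "$f")
          ip=${name%%_*}                       # 10.160.23.1_20140606.cfg -> 10.160.23.1
          mkdir -p "/deploy/backups/$ip"
          mv "$f" "/deploy/backups/$ip/"
      done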

    Read the article

  • SQL SERVER – Finding Out What Changed in a Deleted Database – Notes from the Field #041

    - by Pinal Dave
    [Note from Pinal]: This is the 41st episode of the Notes from the Field series. The real world is full of challenges. When we are reading theory or a book, we sometimes do not realize how the real world works, and that is why we have this series, Notes from the Field, which is extremely popular with developers and DBAs. Let us talk about the interesting problem of how to figure out what has changed in a deleted database. Well, you may think I am just throwing words around, but in reality this kind of problem is what makes a DBA's life interesting, and in this blog post we have an amazing story from Brian Kelley on that very subject. In this episode of the Notes from the Field series, database expert Brian Kelley explains how to find out what has changed in a deleted database. Read the experience of Brian in his own words. Sometimes, one of the hardest questions to answer is, "What changed?" A similar question is, "Did anything change other than what we expected to change?" The First Place to Check – Schema Changes History Report: Pinal has recently written on the Schema Changes History report and its requirement for the Default Trace to be enabled. This is always the first place I look when I am trying to answer these questions. There are a couple of obvious limitations with the Schema Changes History report. First, while it reports what changed, when it changed, and who changed it, other than the base DDL operation (CREATE, ALTER, DELETE), it does not present what the changes actually were. This is not something covered by the default trace. Second, the default trace has a fixed size. When it hits that size, the changes begin to overwrite. As a result, if you wait too long, especially on a busy database server, you may find your changes rolled off. But the Database Has Been Deleted! Pinal cited another issue, and that's the inability to run the Schema Changes History report if the database has been dropped. Thankfully, all is not lost. One thing to remember is that the Schema Changes History report is ultimately driven by the Default Trace. As you may have guessed, it's a trace, like any other database trace. And the Default Trace does write to disk. The trace files are written to the defined LOG directory for that SQL Server instance and have a prefix of log_. Therefore, you can read the trace files like any other. Tip: Copy the files to a working directory. Otherwise, you may occasionally receive a file-in-use error. With the Default Trace files, if you ask the question early enough, you can see the information for a deleted database just the same as for any other database. Testing with a Deleted Database: Here's a short script that will create a database, create a schema, create an object, and then drop the database. Without the database, you can't run a standard Schema Changes History report. CREATE DATABASE DeleteMe; GO USE DeleteMe; GO CREATE SCHEMA Test AUTHORIZATION dbo; GO CREATE TABLE Test.Foo (FooID INT); GO USE MASTER; GO DROP DATABASE DeleteMe; GO This sets up the perfect situation where we can't retrieve the information using the Schema Changes History report but where it's still available. Finding the Information: I've sorted the columns so I can see the Event Subclass, the Start Time, the Database Name, the Object Name, and the Object Type at the front, but otherwise I'm just looking at the trace files using SQL Profiler. As you can see, the information is definitely there. Therefore, even in the case of a dropped/deleted database, you can still determine who did what and when.
    You can even determine who dropped the database (the loginame is captured). The key is to get the default trace files in a timely manner in order to extract the information. If you want to get started with performance tuning and database security with the help of experts, read more over at Fix Your SQL Server. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Notes from the Field, PostADay, SQL, SQL Authority, SQL Query, SQL Security, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • How do you name your personal libraries?

    - by Mehrdad
    I'm pretty bad with naming things. The only name I can ever generically come up with is 'helper'. Say, if I have a header file that contains helper functions for manipulating paths, I tend to put it inside my "helper" directory and call it "path-helper.hpp" or something like that. Obviously, that's a bad naming convention. :) I want to have a consistent naming scheme for my folder (and namespace) which I can use to always refer to my own headers and libraries, but I have trouble finding names that are easy to type or remember (like boost)... so I end up calling some of them "helper" or "stdext" or whatnot, which isn't a great idea. How do you find names for your libraries that are easy to remember and easy to type, and which aren't too generic (like "helper", "std", or "stdext")? Any suggestions on how to go about doing this?

    Read the article

  • Problems connecting Android ICS to Ubuntu using MTP

    - by ubuntico
    I've followed this tutorial from this blog, which very clearly explains how to connect an Android phone running ICS to Ubuntu so that one can access the phone's sdcard (MTP access). I got through the whole procedure with no errors, and I can even attach my mobile to Ubuntu via mtpfs -o allow_other ~/Android/GalaxyS2 and disconnect via fusermount -u ~/Android/GalaxyS2 The problem comes when I try to access the mounted directory. If I try to do it via Nautilus, the system tries to open the folder for a couple of minutes and then I either see the error, or the folder disappears from Nautilus (it comes back when I disconnect the path). I also get a console error: fuse: bad mount point `~/Android/GalaxyS2': Transport endpoint is not connected I see many people on the net reporting this error, but no one offers any solution to it. I use Ubuntu 11.10 with GNOME Shell (GNOME 3) and the mobile is a Samsung Galaxy S II. I am in the fuse group, and I have done all the steps in the tutorial dozens of times, all in vain.
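    The "Transport endpoint is not connected" message usually means an earlier mtpfs mount died and left a stale FUSE mount point behind, so it is worth clearing that and checking that libmtp can see the phone at all before mounting again. A sketch; mtp-detect comes from the mtp-tools package:

      fusermount -u ~/Android/GalaxyS2 2>/dev/null   # clear any stale mount first
      sudo apt-get install mtp-tools
      mtp-detect                                     # does libmtp recognise the Galaxy S II at all?
      mtpfs -o allow_other ~/Android/GalaxyS2        # then try mounting again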

    Read the article

  • Drupal + LDAP + Automatic

    - by WernerCD
    I've got Drupal 6 set up within a XAMPP test area. I have LDAP authentication, groups and data working against Active Directory. What I want, since I'm on an intranet where users are already logged in with their usernames, is automatic authentication, without the need to log in via the website. If it's more difficult than it's worth, it's no major hassle, but I'd like to know if it's possible that when my users visit our intranet they auto-magically authenticate with their already-logged-in Windows session. Ultimately, I may switch to IIS, but I do like having a portable, easy-to-backup/copy/test setup, so for now I'm going to see if I can get this working in XAMPP.

    Read the article

  • How do you force Ubuntu to unmount a disk when you press the eject button on an optical drive?

    - by Michael Curran
    When upgrading my hardware, I also upgraded to Ubuntu 10.10. On my previous system (with 10.04 and earlier), when I ejected a disk from the optical drive, the subfolder in the /media directory was automatically removed. On my new 10.10 system, if I don't eject the disk using the "eject" command within the system, the disk remains mounted, even after a new disk is inserted. The new drive is a Blu-ray drive, but I haven't noticed any other problems from it. Normally this isn't a problem, but it makes installing applications that are spread over multiple CDs more difficult in many cases (e.g. Wine). Any advice?
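    As a software-side workaround, the eject command unmounts the disk before opening the tray, so binding it to a keyboard shortcut gives much the same effect as a working eject button. A minimal sketch; /dev/sr0 is the usual device name for the first optical drive:

      eject /dev/sr0      # unmounts the mounted disk, then opens the tray
      eject -t /dev/sr0   # closes the tray again, if the drive supports it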

    Read the article

  • Can't get Broadcom 43142 drivers to work after installation on an Inspiron 5420

    - by beckett
    I'm a complete newbie to Ubuntu. I've already installed and reinstalled the required Broadcom driver and done numerous steps in the terminal, but I can't seem to get wireless working. However, my wired connection seems to work just fine. I have also tried to follow the instructions below, but I get stuck at step 2, where I'm told "No package dpkg is available": 1) Download the file from the link 2) sudo yum install dpkg 3) mkdir BCM43142 4) dpkg-deb -x Downloads/wireless-bcm43142-dkms-6.20.55.19_amd64.deb BCM43142 5) cd BCM43142/usr/src/wireless-bcm43142-oneiric-dkms-6.20.55.19~bdcom0602.0400.1000.0400/src/wl/sys 6) sudo yum install kernel-devel kernel-headers 7) vi wl_linux.c 8) around line 43, remove the line include 9) save the file (:wq) 10) cd ../../.. 11) make Things should work, and you'll have a file called "wl.ko" in the current directory. 12) sudo yum remove broadcom-wl 13) sudo mkdir -p /lib/modules/3.5.2-3.fc17.x86_64/extra/wl 14) sudo cp wl.ko /lib/modules/3.5.2-3.fc17.x86_64/extra/wl 15) sudo depmod -a 16) sudo modprobe wl I really need help :/
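    The quoted steps are written for Fedora (yum, an fc17 kernel path), which is why step 2 fails on Ubuntu: there is no yum, and dpkg is already installed. On Ubuntu the same proprietary Broadcom STA (wl) driver is packaged as bcmwl-kernel-source. A sketch, assuming your release's version of that package supports the BCM43142 (recent releases do) and using the working wired connection for the download:

      sudo apt-get update
      sudo apt-get install --reinstall bcmwl-kernel-source    # builds the wl module via DKMS
      sudo modprobe -r b43 ssb brcmsmac bcma 2>/dev/null      # unload drivers that can conflict with wl
      sudo modprobe wl                                        # load the freshly built driver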

    Read the article

  • Changing from one file to another

    - by jbander
    I'm told to go to /home/jbander/Downloads, so how do I do that? I assume you do it in the terminal, but what do you do next? I can get to home, but that's it. How do I go from one directory or file (or whatever they are) to another, and once I'm there, what do I do to see what is in the Downloads folder? One more question: if I want to change a file from, e.g., cow to, e.g., duck, how would I do that (they are just arbitrary names)? How do I get rid of cow and put duck in its place?
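    The three commands involved are cd (change directory), ls (list the contents of the current directory) and mv (move or rename a file). A sketch using the path and the cow/duck names from the question:

      cd /home/jbander/Downloads   # go to the Downloads directory
      ls -l                        # see what it contains
      mv cow duck                  # rename the file "cow" to "duck"

    Typing cd on its own returns you to your home directory at any point.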

    Read the article

  • Need help: unable to mount location

    - by Don't ASk Ubun
    I am not able to start Windows and am using a DVD copy of Ubuntu to start up. I see my 750 GB hard disk, but if I click it I get this error: Error mounting: mount exited with exit code 13: ntfs_attr_pread_i: ntfs_pread failed: Input/output error Failed to read NTFS $Bitmap: Input/output error NTFS is either inconsistent, or there is a hardware fault, or it's a SoftRAID/FakeRAID hardware. In the first case run chkdsk /f on Windows then reboot into Windows twice. The usage of the /f parameter is very important! If the device is a SoftRAID/FakeRAID then first activate it and mount a different device under the /dev/mapper/ directory, (e.g. /dev/mapper/nvidia_eahaabcc1). Please see the 'dmraid' documentation for more details. After googling for a while I think I need to do sudo apt-get install ntfsprogs, but when I try that: E: Package 'ntfsprogs' has no installation candidate My problem is a lot like this thread
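    The ntfsprogs package no longer exists as a separate package: its tools were merged into ntfs-3g, which the live DVD normally already includes. A sketch of what could be tried from the live session; /dev/sda2 is only a placeholder, so check the real partition name with lsblk first. Given the Input/output errors, running chkdsk /f from a Windows recovery disc is still the safer first step if the data matters:

      lsblk -f                                    # find the NTFS partition's real device name
      sudo ntfsfix /dev/sda2                      # ntfsfix ships with ntfs-3g and clears common inconsistencies
      sudo mount -t ntfs-3g -o ro /dev/sda2 /mnt  # then try a read-only mount to copy the data off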

    Read the article
