Search Results

  • How to change mount to grant user write permissions?

    - by nals
    I am on TomatoUSB, using its NAS feature. The only way I can write to the Samba share is if I force root:

        [global]
        interfaces = 127.0.0.1, 192.168.1.1/24
        bind interfaces only = no
        workgroup = WORKGROUP
        netbios name = TOMATO
        security = share
        wins support = yes
        name resolve order = wins lmhosts hosts bcast
        guest account = nobody

        [Public]
        path = /mnt/sda2
        read only = no
        public = yes
        only guest = yes
        guest ok = yes
        browseable = yes
        comment = Network share
        force user = root
        writeable = yes

    I don't really like the idea of having to use root to allow write access to my share. I have already created a Samba account named nobody to allow access to the share, but every time I try to write I get an access-denied error. fstab:

        /dev/sda2 /mnt/sda2 vfat defaults 0 0

    Furthermore, every time I try to chmod 777 /tmp/mnt/sda2, the permissions are not changed and no error is produced; they stay at 755:

        drwxr-xr-x 2 root root 4096 Jun 4 01:49 sda2

    Basically: how can I give the user nobody write permissions to my mount? (dev name: /dev/sda2, dev mount: /tmp/mnt/sda2)
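
    A hedged sketch of the likely fix, assuming the root cause is the vfat mount rather than Samba: FAT has no Unix permission bits, so chmod is a no-op, and ownership and mode are fixed at mount time by mount options. The numeric IDs below are assumptions; use the real ones from id nobody on the router.

        # /etc/fstab -- hypothetical: hand the filesystem to the guest account at mount time
        /dev/sda2  /mnt/sda2  vfat  uid=65534,gid=65534,umask=0000  0  0

    With that in place, the force user = root line in [Public] should no longer be needed.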

  • Retrieve a user's Exchange database in PowerShell

    - by Paul
    Hey everyone, I've scoured the interwebs off and on for a few days now to find this. I am creating a PowerShell script for mail-enabling new users (Exchange 2007). To give you a little background: when we have a new hire, their AD account is created at our off-site helpdesk, but they don't create their email account. I'm trying to automate the process of mail-enabling the user, which involves putting them in the same database as an existing user, disabling IMAP/POP/ActiveSync, and lastly emailing the requester of the ticket. I would like to just get prompted for the new user's name, the user to replicate (mailbox, storage group, database), and the person to email after it's been created. So if someone could just help with a command to retrieve a user's Exchange database in PowerShell that would be great, but if people also want to help with my hacked-up script, please do so as well! Here is what I have so far:

        Write-Output "ENTER THE FOLLOWING DETAILS"
        $DName = Read-Host "User Display Name"
        $RUser = Read-Host "Replicate User (Database Grab)"
        ***$RData = # get the replicate user's mailbox database here***
        $REmail = # either just use Read-Host "Requester's Email address", or ask for the
                  # requester's name and dig up their address with PowerShell
        Enable-Mailbox -Identity "$DName" -Database "$RData"
        Send-MailMessage -From "John Doe <[email protected]>" `
            -To $REmail `
            -Subject "Test Person's email account" `
            -Body "Test Person's email account has been setup.`n`n`nJohn Doe`nGeneric Company`nSystems Administrator`nOffice: 123.456.7890`[email protected]" `
            -SmtpServer genericexchange.exchange.com
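
    A hedged sketch of the missing line, assuming the Exchange 2007 cmdlets are loaded in the session: Get-Mailbox exposes the source user's database, and Enable-Mailbox accepts it directly.

        # hypothetical completion: grab the database of the user being replicated
        $RData = (Get-Mailbox -Identity $RUser).Database
        Enable-Mailbox -Identity $DName -Database $RData
        # IMAP/POP/ActiveSync can then be turned off in one go:
        Set-CASMailbox -Identity $DName -ImapEnabled $false -PopEnabled $false -ActiveSyncEnabled $false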

  • Creating multiple SFTP users for one account

    - by Tom Marthenal
    I'm in the process of migrating an aging shared-hosting system to more modern technologies. Right now, plain old insecure FTP is the only way for customers to access their files. I plan on replacing this with SFTP, but I need a way to create multiple SFTP users that correspond to one UNIX account. A customer has one account on the machine (e.g. customer) with a home directory like /home/customer/. Our clients are used to being able to create an arbitrary number of FTP accounts for their domains (to give out to different people), and we need the same capability with SFTP. My first thought was to use SSH keys and just add each new "user" to authorized_keys, but this is confusing for our customers, many of whom are not technically inclined and would prefer to stick with passwords. SSH is not an issue; only SFTP is available. How can we create multiple SFTP accounts (customer, customer_developer1, customer_developer2, etc.) that all function as equivalents and don't interfere with file permissions (ideally, all files should retain customer as their owner)? My initial thought was some kind of PAM module, but I don't have a clear idea of how to accomplish this within our constraints. We are open to using an alternative SSH daemon if OpenSSH isn't suitable for our situation; again, it needs to support only SFTP and not SSH. Currently our SSH configuration has this appended to it in order to jail the users in their own directories:

        # all customers have group 'customer'
        Match group customer
            ChrootDirectory /home/%u       # jail in home directories
            AllowTcpForwarding no
            X11Forwarding no
            ForceCommand internal-sftp     # force SFTP
            PasswordAuthentication yes     # for non-customer accounts we use keys instead

    Our servers are running Ubuntu 12.04 LTS.
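
    A hedged sketch of one common approach, assuming plain extra Unix accounts are acceptable: give every SFTP login its own account in the customer's group, jail them all into the customer's home, and lean on a forced umask so files stay group-accessible. Account and path names below are assumptions.

        # hypothetical account creation (no shell needed; internal-sftp never uses it)
        useradd -g customer -d /home/customer -s /usr/sbin/nologin customer_developer1

        # sshd_config sketch -- note sshd requires the chroot target itself to be
        # root-owned and not group-writable, so the writable area typically sits
        # one level down inside the jail
        Match group customer
            ChrootDirectory /home/customer
            ForceCommand internal-sftp -u 0002   # group-writable uploads
            AllowTcpForwarding no
            X11Forwarding no

    Files then arrive owned by each extra account with group customer; making customer the literal owner of everything, as asked, would still take a periodic chown or an ACL, which is the part a PAM-style shared-uid trick would avoid.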

  • Move files from FTP server to S3

    - by lev
    I would like to set up an FTP server where users will upload files, and for each file, put it on S3 storage and delete it from the FTP server. (The server runs Ubuntu on EC2.) Here is what I have already tried, with no success:

    - Mounting the S3 bucket using s3fs. I followed the instructions, but there is a bug in the latest version of s3fs that prevents it from working. The bug was fixed on the develop branch, but I don't want to use an unstable version in production.
    - Using vsftpd and running s3cmd sync via cron to sync the files periodically. The problem with that approach is that s3cmd can start running in the middle of a file upload and start syncing the incomplete file. Also, s3cmd doesn't give any feedback if the sync fails, so I have no way of knowing whether I can delete the files after the sync command has finished running.
    - Using pure-ftpd's upload-script feature (which allows running a script after a file has finished uploading), but I noticed that if the upload failed in the middle, the script runs anyway, and I have no way of knowing whether the upload was successful or not.

    I've been at it for a few days now, and I'm at a loss here. Any suggestions will be welcomed.
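
    A minimal sketch of the upload-hook route, assuming a hook that receives the finished file's path as its argument (as pure-ftpd's pure-uploadscript does): let the exit status of s3cmd decide whether the local copy may be deleted, so a failed transfer is never lost. The bucket name is an assumption.

        #!/bin/sh
        # hypothetical upload hook: $1 = path of the file that just finished uploading
        FILE="$1"
        BUCKET="s3://my-bucket"                     # assumption: replace with your bucket
        if s3cmd put "$FILE" "$BUCKET/$(basename "$FILE")"; then
            rm -- "$FILE"                           # only delete after a clean upload
        fi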

  • How to reproduce the behavior of Mac OS X's dead keys on Windows 7?

    - by Pascal Qyy
    I'm French, but I've chosen a QWERTY keyboard for my MacBook Pro for many reasons. First of all, the AZERTY keyboard is not at all ergonomic: it has no numeric keypad, and I must use MAJ or CAPS LOCK to access the numeric keys. Secondly, I bought this Mac for development, and characters like {, } etc. are not directly accessible on the Apple AZERTY keyboard. The last thing is that diacritics are VERY easy to produce on an Apple keyboard under Mac OS X: ⌥ + c for a ç, for example, and many dead keys are easy to use (e.g. ⌥ + e, then e gives you an é). So I have no difficulty writing in my native language with this keyboard under Mac OS X. BUT, when I boot into Windows 7's Boot Camp partition, or when I use its applications through VMware Unity, it is no longer the same comfort! Without a numeric keypad, it's impossible to use Alt codes to produce special characters (e.g. Alt + 0231 for the ç). I've tried many solutions, like auto-replacement in Microsoft Office (e.g. ,,c being replaced by ç), but for all my diacritics I must type a space and then a backspace before the replacement works. I've also tried third-party software, such as Texter, but it is very buggy and doesn't work properly (or doesn't work at all) in many cases! So, is there a solution somewhere to get Mac OS X's nice and comfortable way of producing diacritics on Windows 7? Thanks in advance for your help and your time!
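
    A hedged sketch of one workaround, assuming AutoHotkey is acceptable as the third-party tool (it tends to be sturdier than Texter): hotstrings with the * option fire immediately, without the trailing space-then-backspace dance described above.

        ; hypothetical AutoHotkey script -- double comma as a poor man's dead key
        :*:,,c::ç
        :*:,,e::é
        :*:,,a::à
        :*:,,u::ù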

  • Create a PDF that defaults to flip on short edge when printed double-sided

    - by user568458
    We're creating a 2-page PDF brochure for a target audience who will print it on their regular office or home printers. If it is printed on a double-sided printer (common in offices), it'll come out correctly if the user manually selects "Flip on short edge", but the second page will come out upside down if default settings are used (flip on long edge). Our target audience isn't very tech-literate, and we've found that even within our own office network there is variation in the location of the "Flip on short edge" setting, so it isn't realistic to give everyone who downloads the PDF instructions on how to change it, or to expect everyone to find the setting on their own. So, when creating a PDF (ideally using Adobe InDesign or Acrobat, but if other software or hacking is needed, that's fine), is there a way to configure the PDF file itself so that when printed double-sided with default settings, it flips on the short edge? If possible, it would be useful supplementary info to know how reliable any such method is across different PDF readers (e.g. Adobe Reader, Acrobat, Mac Preview, built-in browser readers such as Chrome's, FoxIt, etc.). If questions about content creation like this aren't a great fit here, feel free to migrate this to the graphic design Stack Exchange site; this question seems to fall halfway between the two sites.
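
    A hedged pointer at the mechanism, assuming a reader that honours it: PDF 1.7 defines a Duplex entry in the document's viewer-preferences dictionary, which Acrobat and Adobe Reader (version 8 and later) use to pre-set the print dialog; many other viewers simply ignore it, so reliability varies exactly along the lines asked about.

        % hypothetical catalog snippet inside the PDF
        1 0 obj
        << /Type /Catalog
           /ViewerPreferences << /Duplex /DuplexFlipShortEdge >>
        >>
        endobj

    In Acrobat the same preference can reportedly be set from File > Properties > Advanced (Print Dialog Presets) without editing the file's internals.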

  • Using Computer name in URL causes issues when connecting to Web Services

    - by AWinters
    The set of applications I work on all access the same 8 or so web services that we have. These services and applications all reside on the same box, and all of them use the computer name when trying to connect to the web services. For example: if I have a web service called MapDataService and an application that accesses it, the application would use the URL http://COMPUTERNAME/MapDataService/MapDataService.asmx. This works in most of the applications that access the web service. However, we have several applications that, when using the computer name in the URL, get no data returned from the service (actually, a 503 is returned). In order to get them to work, the IP address of the system has to be used in place of COMPUTERNAME. This strikes me as very odd considering that, as I mentioned before, all applications and services are on the same box, and most other applications use COMPUTERNAME with no issues. Can someone give me some insight as to what could be causing this? We have no access to IIS logs, and the logs we did get (this is on a customer site) are not very useful.

  • OS X (10.6) Apache Sudden Death, Nginx not working either...

    - by Jesse Stuart
    Hi, I turned on my computer today and Apache wasn't working. This is weird, as it has been working for the last 6 months without issue. The only thing I did that may have caused a problem is that I uninstalled a bunch of gems. That shouldn't be the issue, though, as Apache doesn't rely on gems. I decided to give nginx a try to see if it would work, and it has the exact same issue. The symptoms are:

    - I go to http://localhost and get the browser's default 404 page (not rendered by Apache/nginx)
    - No error is found anywhere (I checked all logs)
    - Apache is running (also tried with nginx)

    How can I debug this to find the root of the problem? I can't think of why this would be happening. I've tried repairing permissions in case that was the issue; apparently it wasn't. Everything was working the other day, and nothing changed in the Apache config.

    Update: here is the output of telnet localhost 80:

        $ telnet localhost 80
        Trying ::1...
        telnet: connect to address ::1: Connection refused
        Trying fe80::1...
        telnet: connect to address fe80::1: Connection refused
        Trying 127.0.0.1...
        telnet: connect to address 127.0.0.1: Connection refused
        telnet: Unable to connect to remote host
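
    A few hedged checks, assuming the stock 10.6 layout: "connection refused" on every address means nothing is bound to port 80 at all, so the question becomes why the daemon exits or binds elsewhere rather than why it misroutes requests.

        sudo apachectl configtest            # syntax-check the config Apache actually loads
        sudo launchctl list | grep -i httpd  # is launchd even keeping it loaded?
        netstat -an | grep LISTEN            # is anything on *.80, or did it move ports?
        tail -50 /var/log/apache2/error_log  # 10.6 logs startup failures here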

  • Developing a high-performance, scalable Zend Framework website [on hold]

    - by Daniel
    We are going to develop a classified-ads website similar to http://www.gumtree.com/ (it will not be exactly like it; that's just to give you an idea), and we are having some concerns about performance and scalability. We are planning on using Zend Framework for this project, but that is all I'm sure of at this point. I don't think a classic approach like Zend Framework (PHP) + MySQL + Memcache + jQuery (and I would throw Doctrine 2 in there too) will result in a high-performance application. I was thinking of making this a RESTful application (with Zend Framework) + NGINX (or maybe MongoDB) + Memcache (or eAccelerator, though I understand this creates problems with scalability across multiple servers) + jQuery, or maybe throw Backbone.js in there, with a CDN for static content, a server for images, and a scalable server for the requests and the rest. My questions are:

    - What do you think about my approach?
    - What solutions would you recommend for developing a high-performance, scalable application expected to have a lot of traffic using PHP (Zend Framework 2)? I would be interested in your approach.

    I should note that I'm a Zend developer; I've been working with Zend for over 3 years, which is why I'm choosing it.

  • Configure Samba server for Unix group

    - by Bird Jaguar IV
    I'm trying to set up a Samba server with access for users in the Linux (RHEL 6) "wheel" group. I am basing smb.conf on the example here, where it goes through the [accounting] example. In my smb.conf I have:

        [tmp]
        comment = temporary files
        path = /var/share
        valid users = @wheel
        read only = No
        create mask = 0664
        directory mask = 02777
        max connections = 0

    (The rest of the output from $ testparm /etc/samba/smb.conf is here.) And groups `whoami` returns user01 : wheel. When I use the following command from another machine (Mac OS) as the Linux user (user01):

        $ smbclient -L NETBIOSNAME/tmp

    it asks for a password; I hit return without a password and get:

        Enter user01's password:
        Anonymous login successful
        Domain=[DOMAIN] OS=[Unix] Server=[Samba 3.6.9-151.el6_4.1]

            Sharename       Type      Comment
            ---------       ----      -------
            tmp             Disk      temporary files
            IPC$            IPC       IPC Service (Samba Server Version 3.6.9-151.el6_4.1)

    But when I try

        $ smbclient //NETBIOSNAME/tmp

    and enter the password I use for the Linux login, I get a bunch of stuff logged, including:

        check_sam_security: Couldn't find user 'user01' in passdb.
        ...
        session setup failed: NT_STATUS_LOGON_FAILURE

    (I can give more logging information if it would be helpful.) I can't find a reference in the resource to any extra steps needed to add group users. Should I be manually adding Samba users from the group somehow? Thank you
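
    A hedged guess at the missing step, which the passdb error above points toward: Samba keeps its own password database separate from /etc/passwd, so each Unix user also has to be registered there before a non-anonymous logon can succeed.

        # hypothetical: register (and enable) the Unix user in Samba's passdb
        sudo smbpasswd -a user01
        sudo smbpasswd -e user01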

  • Simple electric DC question. Current consumption

    - by Bobb
    Suppose you have a DC power supply and a consumer connected to it (e.g. a computer PSU and a hard drive). Suppose the PSU that was supplied with the consumer has an output of 5V 1A, so I assume the consumer should not consume more than 1A. Suppose the original PSU is broken now, and I want to replace it with one I have that is 5V 10A. My guess is that current draw is something that depends on the consumer, so if the consumer normally consumes 1A, it will not consume more than that even when connected to a 10A PSU. In other words: am I right in assuming that the consumer will not burn out when connected to a power supply with a higher current rating? P.S. My understanding is that voltage is something independent of the consumer: if you give it a higher voltage, it will burn (voltage is pushed from the PSU to the consumer). Current, however, works the other way around: the consumer draws as much current as it needs, not as much as the PSU can provide (given, of course, that the PSU's maximum current is greater than what the consumer needs).
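
    A worked check of that intuition, treating the drive as a roughly fixed load (Ohm's law, I = V / R):

        Drive on its original supply:  5 V at 1 A  =>  effective load R = 5 V / 1 A = 5 Ω
        Same drive on the 10 A supply: I = 5 V / 5 Ω = 1 A  (unchanged)

    The 10 A figure is only the supply's ceiling; the load sets the actual draw, so the swap is safe as long as the voltage matches.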

  • Symbolic link modification for HP-UX

    - by kalpesh
    Hi David Zaslavsky, recently I was working on modifying the symbolic links for particular files. While searching the internet I saw your post, and I am trying to use the script you posted:

        find /home/user/public_html/qa/ -type l \
            -lname '/home/user/public_html/dev/*' -printf \
            'ln -nsf $(readlink %p|sed s/dev/qa/) $(echo %p|sed s/dev/qa/)\n' \
            > script.sh

    So I tried to adapt your script to my problem in an HP-UX environment, but it seems that the -lname option does not work on HP-UX. Do you know of something equivalent that I can use? Just to give you an idea of my problem, I want to change all the symbolic links inside a particular folder:

        New symbolic link: /base/testusr/scripts
        Old symbolic link: /base/produsr/scripts

    Folder "A" contains more than 100 different files with soft links that point to /base/produsr/scripts, and I want the files inside folder A to point to /base/testusr/scripts instead. I am trying to achieve this on HP-UX and would really appreciate your help on this.
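
    A hedged sketch of an HP-UX-friendly equivalent, assuming its find(1) supports -type l but neither -lname nor -printf, and that readlink(1) is absent (hence the ls -l parse). Paths are the ones from the question; the loop itself is an assumption to test on a copy first.

        # hypothetical rewrite loop for folder A
        find /path/to/A -type l | while read link; do
            target=`ls -l "$link" | sed 's/.*-> //'`
            case "$target" in
            /base/produsr/scripts*)
                new=`echo "$target" | sed 's|/base/produsr/|/base/testusr/|'`
                rm -f "$link" && ln -s "$new" "$link"
                ;;
            esac
        done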

  • Allowing access to company files across the internet

    - by Renaud Bompuis
    The premise: I've been tasked with finding a solution to the following scenario. Our main file server is a Linux machine. On the LAN, users simply access the files using SMB. Each user has an account on the file server and his/her own access rights. User accounts are simple passwd/group security accounts, not NIS/LDAP.

    The problem: we want to give users (or at least some of them, say if they belong to a particular group) the ability to access the files from the Internet while travelling. Ideally I'd like a seamless solution; maybe something that allows the user to access a mapped drive would be ideal. A web-oriented solution is also good, but it should present files in a way that is familiar to users, in an explorer-like fashion for instance. Security is a must, of course: users would be expected to log in, and the connection to the server should be encrypted. Anyone have some pointers to neat solutions? Any experiences?

    Edit: the client machines are Windows only.
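
    One hedged direction, assuming WebDAV over HTTPS is acceptable: Windows can map a WebDAV URL as a network drive, which covers the mapped-drive, login, and encryption requirements in one go. A minimal Apache sketch, with paths and names as assumptions:

        # hypothetical httpd.conf fragment -- requires mod_dav/mod_dav_fs,
        # a global DavLockDB, and an SSL-enabled virtual host
        <Location /files>
            Dav On
            AuthType Basic
            AuthName "Company Files"
            AuthUserFile /etc/httpd/dav.passwd
            Require valid-user
        </Location>

    The catch to verify: WebDAV won't reuse the existing passwd/group permissions by itself, so per-user rights would need mapping onto the DAV tree (or a VPN + SMB setup instead, if that mapping proves painful).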

  • I need a reverse proxy solution for SSH

    - by Bond
    Hi, here is the situation: I have a server in a corporate data center for a project. I have SSH access to this machine on port 22. There are some virtual machines running on this server, and behind everything, many other operating systems are running. Now, since I am behind the data center's firewall, my supervisor asked me if I can do something to give many people on the Internet direct access to these virtual machines. I know that if I were allowed to receive traffic on ports other than 22, I could do port forwarding, but since I am not allowed that, what can be a solution in this case? The people who would like to connect might be complete novices, happy just opening PuTTY on their machines, or maybe even FileZilla. I have configured an Apache reverse proxy for redirecting Internet traffic to the virtual machines on these hosts, but I am not clear on what to do for SSH. So is there something equivalent to an Apache reverse proxy that can do similar work for SSH in this situation? I do not have the firewall in my hands or any port other than 22 open, and in fact even if I request it, they won't open more. Making people SSH twice (once to the gateway, then again to a VM) is not something my supervisor wants.
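
    A hedged sketch of one way to fake it on the single open port, assuming a transparent second hop is acceptable even though a visible double login is not: give each VM its own login name on the gateway's sshd and force that login straight through to the VM, so a PuTTY user just connects to the gateway on port 22 as "vm1". Names and addresses below are assumptions.

        # sshd_config on the gateway -- hypothetical
        Match User vm1
            ForceCommand ssh -q -t vmuser@10.0.0.11   # assumes key-based trust gateway->VM
            AllowTcpForwarding no

    Dedicated SSH reverse proxies that route by user name (e.g. the sshpiper project) reportedly exist as well, if a plain sshd trick proves too limited.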

  • Replacement for public folder workflow, I'm confused as to how SharePoint does it

    - by RodH257
    For years Microsoft has been slowly phasing out public folders; perhaps Exchange 2010 really is the LAST TIME they'll be shipped. I've heard SharePoint is the replacement, but I don't fully understand it. Can someone give me an idea of how to replace this workflow? In our office we have projects, and each has a project number, e.g. 10353. Each job has a public folder, organised in a hierarchy like:

        Projects
            Year
                Folder
                    Subfolders

    The main subfolder we use is for general correspondence. When an email is received that relates to a project, it is dragged and dropped (or moved via right-click) into a public folder; adding public-folder favourites for each user helps with this. When an email is sent, we have a custom email form, which is the default email form but with a project-number field next to the subject line. When you enter the job number there, it carbon-copies our filing system, which reads the job number and puts the email in the public folder for you. If you need to refer to emails, you go to the public folder and find them there. This isn't the best with large jobs, but it works OK. Now, I have limited experience with SharePoint (well, WSS); we've used it for some neat discussion boards/polls etc. as an intranet site, but I haven't seen much of its integration with Outlook. The great thing about our solution is how tightly it integrates with Outlook, which is exactly where the emails are. If you want to forward an old email, you go to the public folder and forward it; simple. Any solution that replaces it should be at least as easy as this. Improvements we would like are better searching of emails, better support in Exchange (i.e. future versions), and a move away from custom Outlook forms (the VBA kind), since those are being phased out. Does SharePoint do this? Or what solutions do this kind of thing?

  • Dropped WD external hard disk, now it's shown as "Not initialized"

    - by Phelios
    So, my WD My Passport external hard disk was dropped, and after that the computer is unable to read it any more. I was hoping I could just find another enclosure to test whether the disk itself is still readable, but it looks like the drive inside is not a normal SATA or PATA drive; I think it's modified, so I can't find another enclosure to try it in. In the computer I can still see the drive in Disk Management, but it's shown as uninitialized, with no size and no drive letter. I've also tried a couple of recovery tools: some can't detect it at all, and one (the Find and Mount software) can detect it but shows 0 size. None of them can recover the data. WD is willing to replace the disk with a new one, but I still need to recover the data. Is there any way I can do that?

    Update: I tried initializing the disk from the Windows Disk Manager, but it gives the error "The request could not be performed because of an I/O device error."
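
    A hedged note on the usual triage: the My Passport series integrates the USB bridge on the drive's own board (often with hardware encryption), which would explain why the drive doesn't look like plain SATA and why enclosure swaps rarely help. I/O errors at the initialize stage suggest physical damage, so if the data matters, a recovery lab is the safer route; failing that, the standard advice is to image whatever is readable once and run recovery tools against the image, never the dying disk.

        # hypothetical, from a Linux live CD (replace sdX with the real device):
        sudo ddrescue -d -r3 /dev/sdX wd.img wd.log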

  • Different external IP addresses from different sites

    - by user630286
    My router runs ClearOS 6 (CentOS 6). In the router I have two external (Internet) connections from two ISPs: the primary connection is eth2, connected to a cable modem, and the secondary is ppp0, connected to a DSL modem. I have assigned eth2 as the primary connection (with a high metric value); in fact this is done through ClearOS's MultiWAN web interface. I have a test in Nagios to monitor the primary connection, based on the result of curl ifconfig.me. But it seems that ifconfig.me always reports the IP address of my secondary connection. I tested it through a browser: yes, ifconfig.me gives the secondary connection's (ppp0's) IP address, but whatismyipaddress.[com|org] gives my primary address (eth2). I checked the default route on the router with ip route list 0/0, which also shows the primary connection (eth2) as the default route. traceroute www.google.com and traceroute ifconfig.me both seem to go through the primary connection (eth2). As our secondary Internet connection has only a limited download allowance, I don't want to end up having to pay a large sum at the end of the month. Does somebody have an idea why ifconfig.me shows my secondary address? What is the best way to ensure that the router (and thus the LAN) uses the right Internet connection?
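
    A couple of hedged diagnostics, assuming MultiWAN installs policy-routing rules that override the plain default route (in which case ip route list 0/0 alone doesn't tell the whole story):

        ip rule show                                       # per-source/per-mark rules MultiWAN added
        ip route get $(dig +short ifconfig.me | head -1)   # what the kernel picks for that one host
        curl --interface eth2 ifconfig.me                  # force the probe out the cable uplink
        curl --interface ppp0 ifconfig.me                  # and out the DSL uplink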

  • Routing and authenticating all access through Squid

    - by Knight Samar
    Hi, I want to route all Internet access in my network through a Squid proxy server, authenticating and logging all users. I want this to be a client-independent setting, so that no one needs to change anything on their browsers or machines. I have set the network gateway as the proxy server so that all traffic is sent to it; I did this using options in the DHCP server. I first tried using Squid as a transparent proxy, but it won't authenticate in that mode. I then tried using iptables to route all traffic to port 3128, but that won't pop up Squid's authentication dialog box. Finally, I tried telling DHCP to hand WPAD to all clients, by placing a WPAD file on a web server for automatic proxy configuration. Changes in dhcpd.conf:

        option wpad code 252 = text;
        option wpad "\n\000";
        option wpad "http://192.168.1.5/wpad.dat\n";

    The WPAD file:

        function FindProxyForURL(url, host) {
            return "PROXY squid-server-ip-address:3128; DIRECT";
        }

    But the browsers (different versions of Firefox and IE) seem to ignore it. :( What should I do?
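
    A hedged sketch of the WPAD side, under a few assumptions worth checking: dhcpd should define option 252 once (not three competing values as above); Firefox only discovers WPAD via DNS, never via DHCP; and IE needs "Automatically detect settings" ticked plus the file served with the right MIME type.

        # dhcpd.conf -- hypothetical single definition, for IE and friends
        option wpad code 252 = text;
        option wpad "http://192.168.1.5/wpad.dat";

        # DNS fallback for Firefox: serve the same file as
        #   http://wpad.<your-internal-domain>/wpad.dat
        # with MIME type application/x-ns-proxy-autoconfig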

  • Creating a Jenkins build farm in a hands-off manner?

    - by user183394
    My colleague and I have set up and run Jenkins on a KVM guest running Ubuntu 12.04, with good results for a while now. We are thinking about deploying a cluster of Jenkins CI hosts in a master/slave configuration, with the libvirt slave plugin to keep our hardware count low. Our environment is strictly Linux (CentOS, Scientific Linux, Fedora, and Ubuntu). Both of us are competent in setting up large clusters; we typically use tools like Cobbler plus a configuration-management tool (Puppet, Chef, and the like) to set up a large number of machines (physical and/or virtual) hands-off (hundreds of nodes in less than an hour is typical). We would like to do the same for nodes running Jenkins, but the step-by-step guide doesn't give us any clues in this regard. I did see a multi-slave config plugin, but, being used to dealing with hundreds of machines completely hands-off, clicking through the UI for many machines just doesn't feel right. Can someone point us to a reference that talks about how to set up a large cluster of Jenkins CI hosts in a more hands-off way?
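
    A hedged pointer, assuming the Jenkins Swarm plugin fits the environment: it inverts the enrolment so each slave self-registers with the master, which reduces the per-node work to something Puppet or Chef can template as a service. URL, credentials, and labels below are assumptions.

        # hypothetical one-liner a config-management tool would drop in on each node
        java -jar swarm-client.jar \
            -master http://jenkins.example.com:8080 \
            -username buildbot -password "$SECRET" \
            -labels "centos6 kvm" -executors 4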

  • Why can't I debug my ASP project through a remote desktop connection?

    - by Anthony Benavente
    I just asked this question on Stack Overflow, but I figured this Stack Exchange site is a better fit. It's been about a month of trying to figure out this problem, and we've still not found a solution. We have about seven virtual machines on a server running Windows XP Professional w/ SP3, all with Visual InterDev and IIS 5.1 installed. Running the programs all works fine, but we just can't debug through Remote Desktop. When we are logged into the server console (through vSphere) and log into one of the virtual machines from there, we are able to debug properly. We figure the issue lies with some kind of permissions for Remote Desktop users. We've tried nearly every article on the internet (exaggerating, of course) and are about to give up hope. One more thing: when we are logged into the virtual machine through the server console and then remote in, the user that was logged into the console is kicked off, but debugging works! Does remoting in trick the computer into giving us the correct permissions? I'm really not sure how it works. I know this technology predates human history, but we are in the process of migrating from ASP Classic to ASP.NET. Specs:

    - Windows XP Professional w/ SP3
    - IIS 5.1
    - Visual InterDev 6

    Edit: by "debug" I mean running the project with breakpoints. InterDev doesn't stop at breakpoints.
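
    A hedged avenue to rule out, assuming the VS6-era script-debugging stack: breakpoints go through the Machine Debug Manager (mdm.exe), which only admits members of the local "Debugger Users" group, and console sessions can sneak past that via other memberships (e.g. Administrators). Domain and user names below are placeholders.

        REM hypothetical check/fix, run inside each VM
        net localgroup "Debugger Users"
        net localgroup "Debugger Users" MYDOMAIN\someuser /add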

  • Reliable router with good VPN and WAN Throughput [closed]

    - by Asdande
    I have two Cisco RV180 VPN routers, and they are giving me lots of problems: web pages won't load correctly or load slowly, plus many other issues. I have several cases pending with Cisco, and I've given up on these routers. I would like to know if you can recommend a reliable router for our 3 branches (NY - main, SC, and FL). In the NY main office we have 55 users; in the SC branch, 6 users; in Florida only 1 (it will grow soon). I need a router capable of supporting:

    - 3 site-to-site VPN connections
    - VPN throughput of at least 40-50 Mbps
    - WAN throughput of at least 100 Mbps and up
    - a PPTP server for at least 5 PPTP users
    - web filtering (all users need access to the internet)
    - a good firewall
    - port forwarding for an FTP server, able to show the public IPs of FTP users (the RV180 cannot do that; it just shows me the router's LAN interface IP -- I opened a case with Cisco, now escalated to level 2, and still have no answer or workaround)
    - dual WAN ports for load balancing or backup internet
    - gigabit WAN/LAN ports
    - a price in the $400-$500 range

    I was thinking of the TP-LINK TL-ER6120 or TL-ER6020, according to the review on smallnetbuilder.com (http://www.smallnetbuilder.com/lanwan/lanwan-reviews/31983-tp-link-tl-er6020-safestream-gigabit-dual-wan-vpn-router-reviewed), but I don't want to make another mistake as I did when I bought the Cisco RV180. Thank you in advance,

  • PST backup with Volume Shadow Copy Service

    - by NoMadMan
    I was asked to implement the task of backing up 35 PST files ranging from 800 MB to 2000 MB. Windows XP and Windows 2000 workstations are assigned to the users, and we have a Windows 2000 domain controller that we use to back up files onto 3x 500 GB external hard drives. I found several methods, from applications to scripts; local or remote applications would be my last resort. I came across a script based on the Volume Shadow Copy Service, CopyWithVss. I wanted to know whether there would be a problem if the path had spaces. Would mounting the destination path of each PST folder with a drive letter be more practical? My concern with the mounting option is that I would eventually run out of letters, since I have 35 and possibly more workstations to back up. Lastly, can someone give me an example of CopyWithVss as it would be run on a production network? The script is a bit cryptic even after reading it several times. Where in the script do I enter the source and the destination? I'm a Mac user, so please excuse my ignorance with the Windows platform.

  • Booting Ubuntu as a VM with KVM on Ubuntu 12.04

    - by CrazycodeMonkey
    I am trying to boot my very first VM using KVM. I have Ubuntu 12.04 installed, and I made sure the BIOS had the right virtualization flag enabled for the Intel processor by running kvm-ok. I have researched this on Google, and all the instructions I have found so far are outdated. For example, most instructions talk about booting a virtual machine with the following commands:

        qemu-img create -f qcow2 foo.img 100G    # create a virtual disk for your VM
        kvm --name foo -m 1024 -hda foo.img -cdrom whatever.iso -boot d

    This command line is incomplete. First, you need to be root to run it. Second, it is missing an option for the video device: when you run it, you get the error "Could not initialize SDL(No available video device) - exiting". I googled this error and looked it up on Stack Overflow (http://stackoverflow.com/questions/4841908/sdl-init-failure-reason-is-no-available-video-device), but the answer provided there does not work on Ubuntu 12.04. Googling further, I found out that I need to specify a video device, so I finally ran the following command (after creating myimage.img on the drive):

        sudo kvm --name mymachine -m 8096 -hda myimage.img --cdrom ubuntu.iso -boot d -vga cirruss -k en-us -vmc :0

    Now this command does not give me an error, but it just hangs. Does anyone have clear instructions on how to run a VM using KVM on Ubuntu?
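
    A hedged reading of that last command: "-vga cirruss" and "-vmc" look like typos for "-vga cirrus" and "-vnc", and with VNC the guest isn't hung at all -- its console simply lives on VNC display :0 rather than on the terminal.

        # hypothetical corrected invocation
        sudo kvm -name mymachine -m 8096 \
            -hda myimage.img -cdrom ubuntu.iso -boot d \
            -vga cirrus -k en-us -vnc :0
        # then watch the console with a VNC viewer:
        vncviewer localhost:0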

  • Office Compatibility Pack and File Permissions

    - by hymie
    MS isn't my thing, so I hope somebody can give me a pointer. We have a Windows domain with a Server 2003 SP1 Enterprise file server. One of the files in question is an MS Excel 2007 (XLSX) file created by user LK. In the "Security" preferences setting, about half a dozen users (including me) have access to this file: LK is the owner and has Full Control, while the rest of us have Read, Read & Execute, and Write permissions. LK is also the owner of the directory this file resides in; I don't know if that's relevant. So far so good. My desktop machine has Windows XP SP3, Excel 2003 SP3, and the Office Compatibility Pack, which lets me read and write the new XLSX files. However, whenever I save the file, the permissions are changed: the newly written file has permissions only for LK and me, and both are Full Control. So, in short, what am I doing wrong, and how should I set this up to do it right, keeping the permissions the file had when I started?
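
    A hedged explanation worth testing: Office (and the Compatibility Pack's converter) typically saves by writing a brand-new temporary file and renaming it over the original, so the "saved" file picks up freshly inherited ACLs instead of keeping the old file's explicit ones. If so, the fix is to grant the group on the folder with inheritance rather than on the file itself; with the 2003-era tools that might look like the following (path and group name are assumptions):

        REM hypothetical -- grant Change on the folder, inherited by new files (/T walks the tree)
        cacls "D:\Shares\Projects" /E /T /G OURDOMAIN\ProjectTeam:C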

  • Ubuntu preseed installation keeps missing mirror files

    - by JackWu
    I'm installing Ubuntu 12.04.2 with a preseed file, but there is one buggy problem with the preseed mirror setting. The symptom is that the installation process gets stuck, so I tracked down the log file and found the real problem: the installation is looking for a file that isn't there. This is just one of them; another pops up if I fake this file. This all happens during preseeding, so I believe the preseed configuration has something to do with it. I googled "ubuntu preseed mirror" and found a post saying:

        # If you select ftp, the mirror/country string does not need to be set.
        #d-i mirror/protocol string ftp
        d-i mirror/country string manual
        d-i mirror/http/hostname string archive.ubuntu.com
        d-i mirror/http/directory string /ubuntu
        d-i mirror/http/proxy string
        # Alternatively: by default, the installer uses CC.archive.ubuntu.com where
        # CC is the ISO-3166-2 code for the selected country. You can preseed this
        # so that it does so without asking.
        #d-i mirror/http/mirror select CC.archive.ubuntu.com
        # Suite to install.
        #d-i mirror/suite string lucid
        # Suite to use for loading installer components (optional).
        #d-i mirror/udeb/suite string lucid
        # Components to use for loading installer components (optional).
        #d-i mirror/udeb/components multiselect main, restricted

    I wonder about the difference between d-i mirror/http/hostname and d-i mirror/http/mirror; I mean, they both specify a mirror, right? In my preseed file there is no d-i mirror/http/mirror, and d-i mirror/http/hostname points to my own repo, as you might notice in the previous image. Here are my questions: does preseeding fetch files/resources from the internet even if I use a local repo? Why is it looking for a file that isn't even there? This has bothered me for quite some time; many thanks in advance to anyone who might give any help.
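
    A hedged check, assuming the stall is the installer fetching suite indices (Release/Packages) that the local repo doesn't carry: the exact missing URL is in the installer's own log, and it can be confirmed from another machine before re-rolling the mirror. The repo host below is a placeholder.

        # hypothetical -- during the install, on the console shell (tty2) or log (tty4):
        grep wget /var/log/syslog | tail        # shows the URL d-i is stuck on
        # from any other host, confirm whether the mirror actually serves it:
        wget -S http://repo.example.com/ubuntu/dists/precise/Release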
