Search Results

Search found 19157 results on 767 pages for 'shared folder'.

Page 347 of 767

  • 2011 i7 MacBook Pro unable to boot from any Windows CD?

    - by Craig Otis
    I'm encountering issues installing Windows alongside my Lion install. I'm attempting to install from the internal SuperDrive, after using Boot Camp to partition what was a single HFS+ volume. When holding down Option at boot, the CD appears in the startup list, but upon selecting it, I get a gray screen for five minutes, then a flashing white folder. I tried installing rEFIt and using it to boot the CD, but I receive an error about "Not Found" being returned from "LocateDevicePath", and a mention of the firmware not supporting booting using legacy methods. In the Console, when opening the Startup Disk preference pane (which never presents the CD as a selectable option), I see:

        11/25/11 4:39:31.159 PM System Preferences: isCDROM: 0 isDVDROM:1
        11/25/11 4:39:31.159 PM System Preferences: mountable disk appeared: /Volumes/GRMCPRFRER_EN_DVD
        11/25/11 4:39:33.214 PM System Preferences: - So far so good, passing disk to System Searcher.
        11/25/11 4:39:33.218 PM System Preferences: OSXCheck: No boot.efi in System Folder or volume root.
        11/25/11 4:39:33.220 PM System Preferences: WinCheck: Not a valid windows filesystem: /Volumes/GRMCPRFRER_EN_DVD
        11/25/11 4:39:33.220 PM System Preferences: WinCheck: Not a valid windows filesystem: /Volumes/GRMCPRFRER_EN_DVD

    I'm at a loss here. I've done my research, but it sounds like most rEFIt errors of this nature are caused by installing from a thumb drive or an external drive; I'm using the internal SuperDrive. Also, I've tried this with two different discs: a Windows XP SP2 CD and a Windows 7 x86 DVD. Both are discs I've had around for years, and I've used them reliably in the past. The system is an early 2011 15" MacBook Pro with all firmware updates installed.

  • WS2008 subst in Logon script does not "stick"

    - by Frans
    I have a terminal server environment running exclusively Windows Server 2008. My problem is that I need to "map" a drive letter to each user's Temp folder. This is due to a legacy app that requires a separate Temp folder for each user but which does not understand %temp%. So, just add "subst t: %temp%" to the logon script, right? The problem is that, even though the command runs, the subst doesn't "stick" and the user doesn't get a T: drive. Here is what I have tried. The simplest version:

        'Mapping a temp drive
        Set WinShell = WScript.CreateObject("WScript.Shell")
        WinShell.Run "subst T: %temp%", 2, True

    That didn't work, so I tried this for more debug information:

        'Mapping a temp drive
        Set WinShell = WScript.CreateObject("WScript.Shell")
        Set procEnv = WinShell.Environment("Process")
        wscript.echo(procEnv("TEMP"))
        tempDir = procEnv("TEMP")
        WinShell.Run "subst T: " & tempDir, 3, True

    This shows me the correct temp path when the user logs in - but still no T: drive. I decided to resort to brute force and put this in my login script:

        'Mapping a temp drive
        Set WinShell = WScript.CreateObject("WScript.Shell")
        WinShell.Run "\\domain\sysvol\esl.hosted\scripts\tempdir.cmd", 3, True

    where \\domain\sysvol\esl.hosted\scripts\tempdir.cmd has this content:

        echo on
        subst t: %temp%
        pause

    When I log in with the above, the command window opens and I can see the subst command being executed correctly, with the correct path - but still no T: drive. I have tried running all of the above scripts outside of a login script and they always work perfectly; this problem only occurs when running from inside a login script. I found a passing reference on an MSFN forum about a similar problem when the user is already logged on to another machine - but I have this problem even without being logged on to another machine. Any suggestion on how to overcome this will be much appreciated.
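
    A quick way to check whether the mapping exists in the session that created it (a side note, not from the original question; subst with no arguments lists the active substitutions):

        rem run in the same command prompt session the script would use
        subst T: %temp%
        subst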

  • Cannot write to Samba shares

    - by Batsu
    Running Samba 3.5 on Red Hat Enterprise 6.1, I'm having issues sharing two folders. Here is the output of testparm:

        [global]
        workgroup = DOMAINNAME
        server string = Samba Server Version %v
        interfaces = lo, eth1
        bind interfaces only = Yes
        map to guest = Bad User
        log file = /var/log/samba/log.%m
        max log size = 50
        idmap uid = 16777216-33554431
        idmap gid = 16777216-33554431
        hosts allow = 10.50.183.48, 10.50.184.41, 10.50.184.199, 10.50.183.160, 127.0.0.1
        hosts deny = 0.0.0.0/0
        cups options = raw

        [test]
        comment = test folder
        path = /usr/local/samba
        valid users = claudio
        write list = claudio
        force user = claudio
        read only = No
        create mask = 0775
        directory mask = 0775

        [test2]
        comment = another test
        path = /home/claudio/tst
        valid users = claudio
        write list = claudio
        force user = claudio
        read only = No
        create mask = 0775

    From the Windows XP machine I'm connecting from, I'm able to read test but not write, while for test2 I can't even access the folder (though I can see it listed).

        ls -l /usr/local
        ...
        drwxrwxrwx. 2 claudio claudio 4096 Dec  3 10:39 samba
        ...
        ls -l /usr/local/samba
        total 32
        -rwxrwxrwx. 1 claudio claudio   9 Nov 29 16:26 asd.txt
        -rwxrwxrwx. 1 claudio claudio 728 Dec  3 10:16 out.txt
        ...
        ls -l /home/claudio/
        ...
        drwxrwxr-x. 2 claudio claudio 4096 Dec  3 09:57 tst
        ...
        ls -l /home/claudio/tst
        total 4
        -rw-rw-r--. 1 claudio claudio 4 Dec  3 09:57 asd.txt

    Any suggestions?
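
    As a diagnostic sketch (not part of the original question; share and user names are taken from the config above), the same shares can be tested from the Samba host itself, which separates server-side permission problems from Windows-side ones:

        # connect to the problem share as the same user and attempt a write
        smbclient //localhost/test2 -U claudio
        # then at the "smb: \>" prompt: put /tmp/testfile.txt testfile.txt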

  • Extract a section of a tgz file

    - by TRiG
    I have a 28.5 GB .tgz file which was created on the command line of a Linux computer, compressing one folder and all its many many subfolders. I now want to extract a single sub-sub folder from that .tgz file, using 7zip on Windows Vista. I can't see a way to do it. Opening the .tgz file in 7zip just shows the .tar file inside it. There doesn't seem to be any way to browse that .tar file and extract the section I want. I assume there is a way to do this, but I can't see it. Simply double-clicking on the .tar file brings up a progress bar which runs slowly till my computer complains it's running out of space; I imagine it's trying to extract the whole thing. Searching for "extract section of tgz" and "extract tgz subfolder" and similar found me a way to do it on the Linux command line, but no obvious way to do it on Windows. (Most results found were about extracting into a subfolder, not extracting a subfolder out of the archive.)
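
    For reference, the Linux command-line approach alluded to above looks roughly like this (a sketch; the archive and folder names are placeholders):

        # list the archive to find the exact stored path of the wanted folder
        tar -tzf archive.tgz | less
        # extract only that folder (the path must match how it is stored)
        tar -xzf archive.tgz path/to/sub-subfolder/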

  • Where's my memory?! Nginx + PHP-FPM front end webserver slows to a crawl...

    - by incredimike
    I'm not sure if I have a problem with a memory leak (as my hosting company suggests), or if we both need to read http://linuxatemyram.com. Maybe you clever people can help us out? This is a front-end webserver VM running essentially only nginx and php-fpm on RHEL 5.5. This server is powering Magento, a PHP eCommerce thingy. The server is running in a shared environment, but we're changing that soon. Anyway, after a reboot the server runs just fine, but within a day it will grind itself into nothingness. Pages will take literally two minutes to load, the CPU spikes like crazy, etc. The console is even sluggish when I SSH in. It's like my whole server is being brought to its knees. I've also been monitoring the DB server via top and tcpdumping incoming traffic. The DB stays idle for a good portion of that "slow" load time. When I start seeing queries coming from the front-end server, the page loads soon afterward. Here are some stats after logging in during a slow-down, after restarting php-fpm:

        [mike@front01 ~]$ free -m
                     total       used       free     shared    buffers     cached
        Mem:          5963       5217        745          0        192        314
        -/+ buffers/cache:       4711       1252
        Swap:         4047          4       4042

        [mike@front01 ~]$ top
        top - 11:38:55 up 2 days, 1:01, 3 users, load average: 0.06, 0.17, 0.21
        Tasks: 131 total, 1 running, 130 sleeping, 0 stopped, 0 zombie
        Cpu0 : 0.0%us, 0.3%sy, 0.0%ni, 99.3%id, 0.3%wa, 0.0%hi, 0.0%si, 0.0%st
        Cpu1 : 0.3%us, 0.0%sy, 0.0%ni, 99.7%id, 0.0%wa, 0.0%hi, 0.0%si, 0.0%st
        Cpu2 : 0.0%us, 0.0%sy, 0.0%ni, 100.0%id, 0.0%wa, 0.0%hi, 0.0%si, 0.0%st
        Cpu3 : 0.0%us, 0.0%sy, 0.0%ni, 100.0%id, 0.0%wa, 0.0%hi, 0.0%si, 0.0%st
        Mem: 6106800k total, 5361288k used, 745512k free, 199960k buffers
        Swap: 4144728k total, 4976k used, 4139752k free, 328480k cached

          PID USER    PR  NI  VIRT  RES  SHR S %CPU %MEM   TIME+  COMMAND
        31806 apache  15   0  601m 120m  37m S  0.0  2.0  0:22.23 php-fpm
        31805 apache  15   0  549m  66m  31m S  0.0  1.1  0:14.54 php-fpm
        31809 apache  16   0  547m  65m  32m S  0.0  1.1  0:12.84 php-fpm
        32285 apache  15   0  546m  63m  33m S  0.0  1.1  0:09.22 php-fpm
        32373 apache  15   0  546m  62m  32m S  0.0  1.1  0:09.66 php-fpm
        31808 apache  16   0  543m  60m  35m S  0.0  1.0  0:18.93 php-fpm
        31807 apache  16   0  533m  49m  30m S  0.0  0.8  0:08.93 php-fpm
        32092 apache  15   0  535m  48m  27m S  0.0  0.8  0:06.67 php-fpm
         4392 root    18   0  194m  10m 7184 S  0.0  0.2  0:06.96 cvd
         4064 root    15   0  154m 8304 4220 S  0.0  0.1  3:55.57 snmpd
         4394 root    15   0  119m 5660 2944 S  0.0  0.1  0:02.84 EvMgrC
        31804 root    15   0  519m 5180  932 S  0.0  0.1  0:00.46 php-fpm
         4138 ntp     15   0 23396 5032 3904 S  0.0  0.1  0:02.38 ntpd
          643 nginx   15   0 95276 4408 1524 S  0.0  0.1  0:01.15 nginx
         5131 root    16   0 90128 3340 2600 S  0.0  0.1  0:01.41 sshd
        28467 root    15   0 90128 3340 2600 S  0.0  0.1  0:00.35 sshd
        32602 root    16   0 90128 3332 2600 S  0.0  0.1  0:00.36 sshd
         1614 root    16   0 90128 3308 2588 S  0.0  0.1  0:00.02 sshd
         2817 root     5 -10  7216 3140 1724 S  0.0  0.1  0:03.80 iscsid
         4161 root    15   0 66948 2340  800 S  0.0  0.0  0:10.35 sendmail
         1617 nicole  17   0 53876 2000 1516 S  0.0  0.0  0:00.02 sftp-server
        ...

    Is there anything else I should be looking at, or any more information that might be useful? I'm just a developer, but the slowdowns on this system worry me and make it hard to do my work. Help me out, ServerFault!
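
    One extra data point worth capturing during a slow-down (a suggestion, not from the original post): a per-process view of resident memory, sorted largest-first, to see whether the "used" figure is really owned by processes or is mostly cache:

        # top 15 processes by resident set size
        ps aux --sort=-rss | head -n 15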

  • How to set up a PRIVATE vimwiki on Dropbox.com

    - by Zongheng Yang
    Hi everyone, I assume those reading this page know what vimwiki and dropbox.com are and what they are for, so I'll go directly to my confusion. The common way of setting up a PRIVATE vimwiki on Dropbox is simply to keep your vimwiki directories under the Dropbox folder (but not under Dropbox/Public/, because that would be PUBLIC). Dropbox allows directly viewing HTML with a dropbox.com/* URL: for example, an index.html can be accessed by the URL https://dl-web.dropbox.com/get/Wiki/html/index.html?w=bfead71a, with a specified string, ?w=bfead71a, appended after the file name. Hence, if inside index.html there is a reference to A.html, located in the same folder as index.html, it has to be accessed using some URL like https://dl-web.dropbox.com/get/Wiki/html/A.html?w=SPECIFIED_STRING. But it is seemingly impossible to hack vimwiki to correct the hrefs in the converted HTML files in this way. Is there some approach that can resolve this problem? I hope I've made myself clear. If you have any questions, please ask me for further explanation. Thank you!

  • Can't open Cocoa Emacs from terminal using open -a

    - by Shane
    I installed Emacs on my MacBook Air running OS X 10.6.5 from this site: http://emacsformacosx.com/. I believe this is what used to be called Cocoa Emacs. I dragged it into my Applications folder and it works fine when I run it from there. I want to be able to run it from Terminal. After some googling, I tried open -a /Applications/Emacs.app foo.txt (foo.txt was an existing file). I got two Emacs windows - one with the welcome screen and one with foo.txt loaded. I tried a few applications in the /Applications directory and they did not seem to behave like this. I had installed it using my own account (an admin account), so after doing ls -l on /Applications I noticed that the owner and group were different from the other entries in this folder. I recursively changed the owner and group to root and wheel, like the others, but this did not help. The only thing that looks funny now is that ls -l shows a @ character, which has something to do with extended attributes, but I don't know how to check these. Any suggestions on what to check next? Is using the open command the only way to run the program? Can I simulate what it does using a shell script?
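
    For the @ marker mentioned above, the attributes can be inspected from Terminal (a sketch; the path assumes Emacs.app sits in /Applications):

        # list extended attribute names next to the permissions
        ls -l@ /Applications/Emacs.app
        # print each attribute and its value
        xattr -l /Applications/Emacs.app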

  • Copy/paste speed very slow for a large number of tiny files on Windows but not on Linux

    - by Arno2501
    I've got a folder which contains 15,000 tiny images (around 400 bytes each). If I copy/paste this folder on my laptop (Windows 7, i7 latest gen, superfast SSD) it takes about 30 seconds (yes, for 7 megs!); the average transfer rate is 400 KBytes/second, which is so slow. I mean, my usual transfer rate is more like hundreds of MBytes per second! I get the same problem on my servers (Windows 2003, 2008/R2) and on every Windows box that I could get my hands on. On the other hand, if I do the same on a Linux box (Debian-based, ext3 FS), which runs on the same SAN as all the Windows servers I've tested, it's nearly instantaneous! I'm pretty sure the size and number of the files may stress one filesystem more than another, but such differences!? Why is it so slow on the Windows boxes (more than 30 sec for 7 MB) and so fast on the Linux ones (a second or so)? (This was not a hardlink that I created; it was a true copy.) Is this normal behaviour or something unusual?

  • Set up multiple websites on a local web server

    - by mickburkejnr
    I have spent the last few days setting up a CentOS 6 server on my local network so that I can host multiple projects that I'm currently working on. Everything has been set up so that I access the server by typing 192.168.1.10 and the Apache test page comes up. What I'm aiming to do is to access different projects by typing in 192.168.1.10/project, and then view the project as if it were on its own standalone server. I have thought about just sticking these sites inside folders on the server and then accessing them that way, but a lot of my projects use CakePHP, so this isn't feasible. So what I need to do is create VirtualHosts in Apache to allow me to do this, but without using a domain name. I want to stick to using the IP address of the machine (which is static). Any ideas?

    EDIT: I've followed Peter's suggestion, but now I have a new problem. In the httpd.conf file I have entered the following information:

        NameVirtualHost *:80
        <VirtualHost *:80>
            ServerAdmin [email protected]
            DocumentRoot /www/html/project1
            ServerName local.project1.com
            ErrorLog logs/local.project1.com-error_log
            CustomLog logs/local.project1.com-access_log common
        </VirtualHost>

    And now Apache is saying:

        Starting httpd: Warning: DocumentRoot [/www/html/project1] does not exist

    When it clearly does exist. I've disabled SELinux and can confirm it isn't turned on. I've also checked the ownership of the folder, and it's owned by root. I can also save files to these folders using a guest FTP account (which isn't associated with root), so the folders are being listed and can be written to. But when I try the folder in a web browser it doesn't seem to work either. I've also rebooted the server and the problem persists. What should I change in order to resolve this?
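
    Since name-based vhosts need distinct ServerNames, one common way to get per-project vhosts on a bare static IP is one port per project - offered as a sketch only (the port and paths are illustrative, and CakePHP's .htaccess rewrites need AllowOverride):

        Listen 8081
        <VirtualHost 192.168.1.10:8081>
            DocumentRoot /var/www/html/project1
            <Directory /var/www/html/project1>
                AllowOverride All
            </Directory>
        </VirtualHost>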

  • Access denied to external USB disk; update access rights fails in Windows 8

    - by gerard
    I used to work with two laptops (Windows Vista and Windows 7), my work files being on an external USB disk. My oldest laptop broke down, so I bought a new one, and had no option other than to take Windows 8. I suspect something changed with access rights, because my external disk started suffering "access denied" problems. Roughly, the cycle looks like this:

    1. On Windows 8, the disk gave "access denied". I was prompted by Windows 8 to fix the access rights, which I tried to do by going to Properties - Security. This process was very slow and ended up saying "disk is not ready". Additionally, my external USB disk was somehow not recognized anymore.
    2. Back on Windows 7, I was warned that my disk needed to be verified, which I did. In this process, some files were lost (most of them I could recover from the FOUND.00x folders, but I have a backup anyway). Also, I don't know why, but under Windows 7 all the folders now show with a lock icon.
    3. Back again on Windows 8: same problem - access denied to my disk, and no way to change access rights, as it gets stuck with "disk is not ready".

    Now I am pretty sure there is some kind of bug or inconsistency between Windows 8 and Windows 7. I did steps 2 and 3 a few times. At some point, I also got an access denied in Windows 7. I could restore access rights on the disk to "System" (Properties - Security - Edit, full control for the group "system"). But then I still get the same access-rights problem on Windows 8, and get stuck in the process of restoring full control to the "system" and "admin" groups. I upgraded Windows 8 with the available updates. That does not help.
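
    For reference (a sketch, not from the original post; it assumes X: is the external disk and an elevated command prompt), the command-line equivalents of taking ownership and re-granting rights are:

        takeown /f X:\ /r /d y
        icacls X:\ /grant Administrators:F /t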

  • Exchange 2003 -- Mailbox Management not deleting ALL messages aged 30 days or older...

    - by tcv
    I've recently created a Mailbox Management task within Exchange 2003 that, every night, looks at the contents of the Deleted Items folder within a particular mailbox and deletes mail that's 30 days or older. The scheduled task ran on its own last night, and I have confirmed that messages within the right mailbox and the right folder were, in fact, processed. Many mails were deleted - but not every email older than 30 days. In fact, the choice seems kinda random. Last night, 3/10/2010 was the 30-day watermark. Mails from 3/10/2010 were deleted, sure enough, but not all of them. Mails older than 3/10/2010 were deleted as well, but, again, not all of them. The only criterion I have on the management task - aside from the single mailbox and single folder scopes - is the age criterion. The size criterion is set to Any, meaning I don't care about the size; I care about the age. It's made me wonder whether there is some sort of limit on how many mails can be processed? The schedule is set for 12am and 1am every night. Any hints appreciated.

  • Configure Apache + Passenger to serve static files from different directory

    - by Rory Fitzpatrick
    I'm trying to set up Apache and Passenger to serve a Rails app. However, I also need it to serve static files from a directory other than /public, and give precedence to these static files over anything in the Rails app. The Rails app is in /home/user/apps/testapp and the static files in /home/user/public_html. For various reasons the static files cannot simply be moved to the Rails public folder. Also note that the root http://domain.com/ should be served by the index.html file in the public_html folder. Here is the config I'm using:

        <VirtualHost *:80>
            ServerName domain.com
            DocumentRoot /home/user/apps/testapp/public
            RewriteEngine On
            RewriteCond /home/user/public_html/%{REQUEST_FILENAME} -f
            RewriteCond /home/user/public_html/%{REQUEST_FILENAME} -d
            RewriteRule ^/(.*)$ /home/user/public_html/$1 [L]
        </VirtualHost>

    This serves the Rails application fine but gives a 404 for any static content from public_html. I have also tried a configuration that uses DocumentRoot /home/user/public_html, but this doesn't serve the Rails app at all, presumably because Passenger doesn't know to process the request. Interestingly, if I change the conditions to !-f and !-d and the rewrite rule to redirect to another domain, it works as expected (e.g. http://domain.com/doesnt_exist gets redirected to http://otherdomain.com/doesnt_exist). How can I configure Apache to serve static files like this, but allow all other requests to continue to Passenger?
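
    One detail a reader might check in the config above (an observation offered as a sketch, not a verified fix for this setup): successive RewriteCond directives are ANDed by default, and a path can never be both a regular file (-f) and a directory (-d) at once, so as written the rule can never fire. The OR form would be:

        RewriteCond /home/user/public_html/%{REQUEST_FILENAME} -f [OR]
        RewriteCond /home/user/public_html/%{REQUEST_FILENAME} -d
        RewriteRule ^/(.*)$ /home/user/public_html/$1 [L]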

  • How do I push my initial snapshot to a subscriber server in SQL Server 2000?

    - by Kev
    I'm configuring transactional replication using the push model. The scenario is:

    - The SQL Servers: SQL01 (publisher) and SQL02 (subscriber), both running SQL 2000 SP4.
    - Both servers are standalone (i.e. not domain members).
    - Both servers have their FQDN and NETBIOS names in their HOSTS files.

    I've managed to configure SQL01 to publish my database and have configured a push subscription for SQL02 using the Push New Subscription wizard, setting the Distribution Agent to update the subscription continuously. On the Push Subscription wizard's "Initialise Subscription" page I selected "Yes, initialise the schema and data" and ticked the "Start the Snapshot Agent to begin the initialisation process immediately" option. All the required services are running (SQL Agent). When I complete the wizard and browse the Replication - Publications folder I can see my publication (blue book with arrow). The publication shows the push subscription and its status is Pending. If I look in the C:\Program Files\Microsoft SQL Server\Mssql\Repldata folder I see a number of snapshot files for each table, e.g. Products.bcp, Products.sch, Products.idx. What should happen now? Should my replicated database now (magically) appear on the subscription server?

  • Best practices for setting up a new Windows 2008 R2 server with EC2 AWS

    - by Alex
    Can someone comment on what they would add to the following SOP in terms of best practices? This is being set up on AWS, and then, after further testing, back in our datacenter.

    Standard Operating Procedure (SOP): Installation Part 2 - Installation of Software Components in Windows 2008 R2 (updated).

    Step 1: Log on to the host through Remote Desktop.
    Step 2: Open Server Manager - Server Roles - install Web Server IIS 7.5 with IIS 6 compatibility features and management compatibility mode.
    Step 3: Open IE/Mozilla to download the software listed below, and save all installation files to a folder called "AWS Server Install Files" for future reference:
        - .NET Framework 2.0 (download from the internet)
        - Crystal Reports for .NET Framework 2.0 (x64) (download from the internet)
        - SQL Server 2005 (AWS image)
    Step 4: Once all software is saved on the local drive, install each package one by one.
    Step 5: Navigate to the Desktop folder to install the software listed below:
        - Microsoft ASP.NET 2.0 AJAX Extensions 1.0 (placed in Desktop\Softwares)
        - WebEx Recorder (placed in Desktop\Softwares)
        - WinRAR (placed in Desktop\Softwares)
    Step 6: Make sure all the software is working fine.
    Step 7: Inspect the server once entirely.
    Step 8: Log off and stop the instance.

  • Improving performance by using an additional static file server

    - by Max
    Hello there, I'm planning for a large website that includes many static assets (JS, CSS, images and thumbnails) in the generated pages. The website will use TYPO3 as CMS (it is a customer requirement). I guess I could seriously improve performance and page load times by using a two-server setup: one server where the main PHP application runs, and another where the static files sit, served by a trimmed-down version of Apache or something like lighttpd. Including e.g. JS or CSS files from the file server is of course no big deal - just use an absolute URL like http://static.example.com/js/main.js and be done with it. But: this website will have pages with MANY thumbnails of e.g. product images on them. So I see two problems when the main application tries to create a thumbnail of some image:

    - The original image, like products/some.jpg, is uploaded to the static file server and is therefore not on the same server as the PHP application which tries to create the thumbnail.
    - TYPO3 writes created thumbnails to a temp directory which is expected to be on the same server. Therefore, hundreds of thumbnails will be written to and served from that temp directory, which is on the same server as the main application - the static file server is in that case basically useless; all thumbnails will be requested from the server of the main application.

    So, my question is: how to overcome these shortcomings? Is it possible to "symlink" some directories to another server, so that, for example, if PHP tries to open the original product image for thumbnail creation with imagecreate("products/some.jpg"), the products folder actually "points" to the products folder on the static image server? I know something like this can be done with .htaccess, but is it possible at the file system level?
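
    At the file system level this is usually done with a network mount rather than a symlink - a sketch, assuming the static server exports the directory over NFS (hostnames and paths are placeholders):

        # on the application server: make the static server's products
        # directory appear at the path the PHP code already uses
        mount -t nfs static.example.com:/var/www/products /var/www/app/products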

  • How to remove NTFS system files from a previous Vista installation

    - by Boldewyn
    I'm trying to shrink my system partition under Windows Vista. It's all fine, except that in front of the last 300 MB of the volume sits a single file that cannot be moved from its position by defrag or other means. It's called C:\$Extend\$UsnJrnl:$J, and my assumption is that it is left over from a previous installation of Vista, from when I set the system up anew. Now, googling for this kind of file brings interesting results, but no solution to my problem:

    - Files left on the disk can become ownerless in a new setup of Windows and inaccessible (even for administrators). To be able to access them again, I found the tip to use takeown to re-assign them to the Admin group (or anyone else). Works like a charm for normal files, but not for the C:\$Extend stuff.
    - The C:\$Extend folder is a system folder of the NTFS file system, where the journal is stored (especially in a file called $UsnJrnl:$Data, whose name is surprisingly close to mine). You can delete the journal with fsutil usn deletejournal /d C:; however, this doesn't work from within the booted system (as I found out by trying). Also, I'm not quite sure of the side effects.
    - You can't move NTFS's own files with standard defrag tools. The same holds, by the way, for inaccessible files.

    Every bit of knowledge out there targets either inaccessible files or the $Extend NTFS stuff, but no one addresses my problem involving both: an inaccessible system file. Question: how can I remove this file, or at least how can I move it on the disk?
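
    As a side note (not from the original question), the journal's presence and size can be checked read-only before touching anything:

        fsutil usn queryjournal C: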

  • Synchronize folders on different computers without cloud and without network, just internet

    - by theimmortalbg
    I have two computers with Windows 7, one in my home town and one in another town, so they are not on a private network, but I have internet on both. They have exactly the same file structure. I am searching for a program that can keep the data in sync. I know about Dropbox and Google Drive, but they are cloud services and I don't want to use them; also, they use a special folder that you have to copy your data into. There are other programs that act like a server - you upload something and can download it later - but I don't need those. I just want to point at which folders should be synchronized and have the program do the synchronization. The sync could happen in real time if both computers are powered on, or some time later once they are; or it could lock the synced folder on the other computer until it has been updated. Essentially, these are my documents, and I want them synced across all my computers and editable from wherever I want. I can move the updates with a flash drive, but a program that saves the changes and applies them to the other computer with one click would make my work much easier.

  • How do I share a complete XP disk so it can be seen from a Windows 7 system? (To move all files to a

    - by Ian Ringrose
    This should be easier! (Both computers can see the internet etc., so I know the network itself is working.) I have a normal home network with a Windows XP machine on it and the new Windows 7 (64-bit) machine. So that I can transfer the files to the new Windows 7 machine, I wish to share the complete disk (and all files) from the Windows XP machine and access them from the Windows 7 machine. Is there a step-by-step set of instructions for doing this anywhere? So far I have:

    - put both computers into the same workgroup
    - put the Windows 7 machine into work network mode so it can see the XP machine in the workgroup
    - shared the XP disk as read only

    But when I try to access a lot of the folders on the XP disk, I am told I am not allowed to access them. (I was not asked for any passwords by the Windows 7 machine when I accessed the XP machine. The XP machine just has its default account with no password set on it.) The XP machine runs XP Home and hence has "simple file sharing" turned on, so it seems that even if I create an admin account (with password) and connect with that account, it still comes in as "guest" on the XP machine. Choosing to share the folder I want access to, rather than the top of the disk drive, seems to work, but is a pain, as I need to share each user's folder with a different share name. If the new computer were not a laptop, I would just plug the hard disk from the old machine into it, but being a laptop I don't have that option.

  • Make Pentaho Reports for OpenERP build on Ubuntu 11.04

    - by Hendri
    Currently I'm trying to install Pentaho Reports for OpenERP, which is referenced at https://github.com/WillowIT/Pentaho-...rver/build.xml. I have installed it on some Windows-based laptops and it worked, but now I'm trying on Ubuntu 11.04 and it prompts me with an error like this:

        build.xml:18: failed to create task or type ...

    Below are the steps I did:

    1. Install java-6-openjdk: "apt-get install java-6-openjdk", then set the installed JDK as the JAVA_HOME environment variable: "nano /etc/environment", adding the new line JAVA_HOME="/usr/lib/jvm/java-6-openjdk".
    2. Install Apache Ant: "apt-get install ant", followed by setting the environment variable: "nano /etc/environment", adding the new line ANT_HOME="/usr/share/ant". Checking the installation with the command "ant" gives a message like "Buildfile: build.xml does not exist! Build failed" (expected, since there is no buildfile in that directory).
    3. Download the Java server from https://github.com/WillowIT/Pentaho-...rver/build.xml, copy it to an Ubuntu shared folder, go to the extracted path and run "ant war". I then get this error message:

        BUILD FAILED
        /share/java_server/build.xml:18: problem: failed to create task or type antlib:org.apache.ivy.ant:retrieve
        cause: The name is undefined.
        Action: Check the spelling.
        Action: Check that any custom tasks/types have been declared.
        Action: Check that any <presetdef>/<macrodef> declarations have taken place.
        No types or tasks have been defined in this namespace yet.
        This appears to be an antlib declaration.
        Action: Check that the implementing library exists in one of:
            - /usr/share/ant/lib
            - /root/.ant/lib
            - a directory added on the command line with the -lib argument
        Total time: 0 seconds

    Is there a compatibility issue, or did I miss some steps? I'm in a project that needs this for reporting in a rush, so please help me solve this issue. Thanks a lot in advance. Best regards,
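
    The quoted BUILD FAILED message means Ant cannot find the Apache Ivy antlib on its classpath. A sketch of the usual remedy on Ubuntu (the package name is the stock one; the jar path may vary between releases):

        apt-get install ivy
        # make the jar visible to Ant, e.g.:
        ln -s /usr/share/java/ivy.jar /usr/share/ant/lib/ivy.jar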

  • For Windows XP: How to uninstall Broadcom Bluetooth software that won't disappear

    - by T. Webster
    Hello, my situation is similar to this question for Windows 7, except my OS is Windows XP SP3. I have recently realized I made the mistake of buying a Bluetooth adapter and installing the Broadcom/Widcomm Bluetooth stack driver software. Now that I know that the software is no good, I want to uninstall it (so I can install the Toshiba stack for the Cirago adapter). I've attempted to uninstall all Bluetooth driver software in Device Manager, and I don't see any remnants of any Bluetooth drivers there. I would include pictures here, but I don't have 10 reputation yet, so I'll just use links instead. picasaweb.google.com/lh/photo/2NAos6BP9UigPtJKaYRGyQ?feat=directlink But I do see that the little Bluetooth icon still persists: picasaweb.google.com/lh/photo/kbxgPP8ZLxhjX4ob2YjSew?feat=directlink I don't see anything about Bluetooth, Broadcom, or Widcomm in Add/Remove Programs. I don't see any folder named Broadcom or Widcomm in the Program Files folder, either. But I do see that Broadcom does show up in the registry with respect to Bluetooth, as shown here: picasaweb.google.com/lh/photo/hWDo-OzB8ApibzeHA_saxQ?feat=directlink What should I do now to completely wipe this persistent Broadcom Bluetooth software off my computer?

  • Empty rewrite.log on Windows, RewriteLogLevel is in httpd.conf

    - by ripper234
    I am using mod_rewrite on Apache 2.2, Windows 7, and it is working ... except I don't see any logging information. I added these lines to the end of my httpd.conf:

        RewriteLog "c:\wamp\logs\rewrite.log"
        RewriteLogLevel 9

    The log file is created when Apache starts (so it's not a permission problem), but it remains empty. I thought there might be a conflicting RewriteLogLevel statement somewhere, but I checked and there isn't. What else could cause this? Could this be caused by Apache not flushing the log file? (I closed it by hitting CTRL-C on the httpd.exe command; this caused the access logs to be flushed to disk, but still nothing in rewrite.log.) My (partial) httpd-vhosts.conf:

        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            ServerName my.domain.com
            DocumentRoot c:\wamp\www\folder
            <Directory c:\wamp\www\folder>
                Options -Indexes FollowSymLinks MultiViews
                AllowOverride None
                Order allow,deny
                allow from all
                <IfModule mod_rewrite.c>
                    RewriteEngine On
                    RewriteBase /
                    RewriteRule . everything-redirects-to-this.php [L]
                </IfModule>
            </Directory>
        </VirtualHost>

  • CIFS Mounting Permissions

    - by malco
    I have an issue that I'm going round in circles with; I hope you can help.

    The setup:

    - Server 1 (CIFS client) - CentOS 6.3, AD-integrated using Samba/Winbind and idmap_ad
    - Server 2 (CIFS server) - CentOS 6.3, AD-integrated using Samba/Winbind and idmap_ad

    All users (apart from root) are AD-authenticated, and this, including groups etc., works happily.

    What's working: I have created a share on Server 2:

        [share2]
        path = /srv/samba/share2
        writeable = yes

    Permissions on the share:

        drwxrwx---. 2 root domain users 4096 Oct 12 09:21 share2

    I can log into a Windows machine as user5 (a member of domain users) and everything works as it should; for example, if I create a file it shows the correct permissions and attributes on both the MS and the Linux sides.

    Where I fall down: I mount the share on Server 1 using:

        # mount //server2/share2 /mnt/share2/ -o username=cifsmount,password=blah,domain=blah

    Or using fstab:

        //server2/share2 /mnt/share2 cifs credentials=/blah/.creds 0 0

    This mounts fine, but... if I su to, or log onto Server 1 as, a normal user (say user5) and try to create a file, I get:

        $ touch test
        touch: cannot touch `test': Permission denied

    Then if I check the folder, the file was created, but as the cifsmount user:

        -rw-r--r--. 1 cifsmount domain users 0 Oct 12 09:21 test

    I can rename, delete, move or copy stuff around as user5; I just can't create anything. What am I doing wrong? I'm guessing it's something to do with the mount action, as when I log onto Server 2 as user5 and access the folder locally it all works as it should. Can anyone point me in the right direction?
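
    An aside that may explain the ownership seen above (not from the original question): with a single credentials= entry, every operation through the mount is performed on the server as that one account, regardless of which local user runs it. Mount options exist to change the local mapping - a sketch, with availability and effect depending on kernel and cifs-utils versions (forceuid may be needed when the server supports Unix extensions):

        # present files as owned by a given local user
        mount -t cifs //server2/share2 /mnt/share2 -o credentials=/blah/.creds,uid=user5,forceuid
        # on kernels that support it, -o multiuser instead maps each local
        # user to their own server credentials rather than one shared account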

  • How to migrate Outlook Express mail rules?

    - by ronwest
    I have a home computer that only had a 15 GB C: drive, and it ran out of space with all the Microsoft updates, etc., that keep coming down. So I fitted a 160 GB drive as a C: drive and altered the drive jumpers to make the old C: drive into a slave D: drive, to save migrating documents, etc. I've installed a clean copy of Windows XP SP3 and reassigned the new Outlook Express's mailstore path to point to the old mailstore folder, which now has a D: drive letter - and it all works OK. However, my extensive list of mail rules has not been transferred to the new OE, and I have not been able to identify how the rules are stored. To find out, I added a new rule to the new OE, exited OE, then searched the whole computer (including hidden/system files) for files altered around the time I added the rule. I hoped I could just overwrite a new empty file with an old one. But the only files that seem to have changed are Windows system-level files and some bits and pieces in the Windows\Prefetch subfolder. None of them can be opened, as XP has them locked, and none of them have names that are anything to do with email or rules. Does anyone know of any way of migrating OE rules, or do I have to re-enter them by hand? Thanks!
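
    A hedged pointer, from memory and worth verifying: OE mail rules are believed to live in the registry, per identity, under Identities\{GUID}\Software\Microsoft\Outlook Express\5.0\Rules\Mail, rather than in the mailstore - which would explain why no mailstore file changed. The old installation's copy could be examined by loading its profile hive from the D: drive - a sketch with an illustrative path:

        reg load HKU\OldProfile "D:\Documents and Settings\YourName\NTUSER.DAT"
        rem then browse HKU\OldProfile\Identities\... in regedit
        reg unload HKU\OldProfile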

  • Creating "save as" functionality to .eml file in AppleScript

    - by unieater
    I'm new to AppleScript and trying to figure out how to save a Mail.app message as an .eml file. Ideally, it would act similarly to the Mail menu bar action, in which the message and the attachments are saved together. The workflow is that you have a selection in Mail, hit a hotkey, and the function creates a filename (newFile) for the email to be saved. I just need help with how to save the message to the path (theFolder) in .eml format.

        tell application "Mail"
            set msgs to selection
            if length of msgs is not 0 then
                display dialog "Export selected message(s)?"
                if the button returned of the result is "OK" then
                    set theFolder to choose folder with prompt "Save Exported Messages to..." without invisibles
                    repeat with msg in msgs
                        -- determine date received of msg and put into YYYYMMDD format
                        set msgDate to date received of msg
                        -- parse date using proc pad2() below
                        set {year:y, month:m, day:d, hours:h, minutes:min} to (msgDate)
                        set msgDate to ("" & y & my pad2(m as integer) & my pad2(d))
                        -- assign subject of msg
                        set msgSubject to (subject of msg)
                        -- create filename.eml to be used as the saved title
                        set newFile to (msgDate & "_" & msgSubject & ".eml") as Unicode text
                        -- copy mail message to the folder and prepend date-time to file name
                        -- THIS IS WHERE I AM COMPLETELY LOST: HOW TO SAVE THE EMAIL into theFolder
                    end repeat
                    beep 2
                    display dialog "Done exporting " & length of msgs & " messages."
                end if -- OK to export msgs
            end if -- msgs > 0
        end tell

        on pad2(n)
            return text -2 thru -1 of ("00" & n)
        end pad2
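
    For the missing step, one possible approach (a sketch, untested here; it assumes Mail's scripting dictionary exposes the raw message text via the message's source property, and that this raw RFC 822 text is acceptable as .eml content):

        -- inside the repeat loop, in place of the lost step
        set msgSource to source of msg
        set outFile to ((theFolder as text) & newFile)
        set fileRef to open for access file outFile with write permission
        set eof of fileRef to 0
        write msgSource to fileRef
        close access fileRef

    Note that subjects containing characters such as ":" or "/" would need sanitising before being used in a filename.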

  • Preserve embedded album art when converting from .flac to .ogg

    - by Profpatsch
    I want to convert my archived .flac library to .ogg for daily use. Using

        find ./ -iname '*.flac' -print0 | xargs -0 -n1 oggenc -q6

    on the root music folder and then deleting every .flac (having copies of them in the archive) seems straightforward. After trying it with one file, it worked, and all of the tags were transferred too - except for one: embedded album art! I always prefer embedded covers over folder images, since I have some albums with varying covers. One possible solution is discussed here, but the script only works if the image is already extracted: Embed album art in OGG through command line in linux. One possible solution I thought about was extracting the album art from every song (not every song has one, though, and some even have two or three!), temporarily saving it, and then using the script to include it in the finished .ogg. But I also want to increase the number of processes xargs runs simultaneously to save time, so the temp images need distinct names. Is there a (Linux) program that knows how to handle this? Or is there a finished script floating around somewhere? It would be nice if oggenc supported adding embedded cover art, and it really is a shame that it doesn't, since these two formats should (in theory) share the same tag format.

    Edit: 15 days and no one has even tried to answer. It's funny; most of my questions don't get answered. Too hard? Wrong SE site?
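
    A sketch of the per-file temp-image idea (hedged: it assumes metaflac's --export-picture-to option and at most one embedded picture per file; deriving the temp name from each input file keeps parallel xargs runs from colliding):

        #!/bin/sh
        # convert-one.sh: convert a single .flac, extracting its embedded
        # picture to a per-file temp name alongside it
        f="$1"
        img="${f%.flac}.cover"
        metaflac --export-picture-to="$img" "$f" 2>/dev/null || img=""
        oggenc -q6 "$f"
        # embedding "$img" into "${f%.flac}.ogg" is left to the linked script
        [ -n "$img" ] && rm -f "$img"

    which could then be driven in parallel with, e.g., find ./ -iname '*.flac' -print0 | xargs -0 -n1 -P4 ./convert-one.sh.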
