Will doing "Run as administrator" on a .BAT file allow regsvr32 enough privlieges to register a DLL or OCX?
The .bat file contains:
regsvr32 -u SomeOCX.ocx
regsvr32 SomeOCX.ocx
Or, does the logged in user need to be an administrator?
We have a Windows server with Mac OS X 10.5 and 10.6 clients. On the server are some Automator workflows with custom icons stored in external resource fork files, like this:
foo.wflow
._foo.wflow
The custom icons are only visible to the 10.5 clients.
I have a local share with my VMware development server, and I'm finding that when I create new files via OS X, they are created as root instead of jacob.
Which is weird, because when I use Connect to Server I'm explicit about the user being jacob, i.e.
afp://[email protected]/
Suggestions?
I have an Ubuntu Server 11.10 machine with Apache 2.2.20, PHP 5.3.6, and an installation of the Joomla CMS. I have used an extra hard disk as my web server storage and mounted it at /data/www/ (I hope that's not where my problem is!).
I've set the permissions of all files and folders in my web root to 755, and their owner:group is set to [default Ubuntu user (in my case radio)]:www-data.
In the past few days I have had serious problems with Joomla not showing newly uploaded images and other files, and I also can't install any extensions. After hours of searching I found out that uploaded files don't get appropriate permissions (they are -rw-------), so the Joomla application cannot read, copy, or move them after upload.
I'm wondering how I can set a default permission so that all files I upload receive it.
PS: I've tested umask, but it did nothing; I think it has nothing to do with my problem.
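For now I work around it by re-fixing permissions by hand after each upload, roughly like this (a sketch assuming /data/www is the web root and radio:www-data the intended owner, both as described above):

    # Re-apply the ownership and permissions described above (755, radio:www-data).
    chmod -R 755 /data/www
    chown -R radio:www-data /data/www

Obviously this is a band-aid; I'm after a way to make new uploads get these permissions automatically.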
I'm looking for a tool that will reliably verify the integrity of ALL files on a Windows 7 x64 NTFS disk.
This is for testing experimental defrag software, so it really needs to be secure and foolproof. I know it will take a long time, as there are millions of files on the disk, but safety just cannot be compromised in a situation like this. A freeware solution is much preferred.
It can be either Windows software (which introduces pitfalls, since files change as Windows boots) or a standalone boot environment (for example a Linux boot CD plus a USB key for storing checksums/metadata).
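For the standalone-boot route, I imagine something along these lines, with the NTFS disk mounted read-only and the checksum list kept on the USB key (device names and mount points here are placeholders):

    # Before defragging: checksum every file onto the USB key.
    mount -o ro /dev/sda2 /mnt/disk
    cd /mnt/disk
    find . -type f -print0 | xargs -0 sha256sum > /mnt/usb/before.sha256
    # After defragging, boot the CD again and verify:
    cd /mnt/disk && sha256sum -c --quiet /mnt/usb/before.sha256 > /mnt/usb/verify.log 2>&1

If there's an existing tool that does this more robustly (handling NTFS alternate data streams, metadata, and so on), that's what I'm after.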
I have a list of files I need to copy on a Linux system - each file ranges from 10 to 100GB in size.
I only want to copy to the local filesystem. Is there a way to do this in parallel - with multiple processes each responsible for copying a file - in a simple manner?
I can easily write a multithreaded program to do this, but I'm interested in finding out if there's a low level Linux method for doing this.
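For reference, the naive approach I have in mind is something like this (filelist.txt and the destination path are made up for illustration):

    # Launch up to 4 cp processes at a time, one per file listed in filelist.txt.
    xargs -a filelist.txt -d '\n' -I{} -P4 cp {} /destination/

I'm wondering whether there's something more native than leaning on GNU xargs like this.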
I am running Ubuntu Desktop 12.04 with nginx 1.2.6. PHP is PHP-FPM 5.4.9.
This is the relevant part of my nginx.conf:
http {
    include       mime.types;
    default_type  application/octet-stream;
    sendfile      on;
    root          /www;
    keepalive_timeout 65;

    server {
        server_name testapp.com;
        root /www/app/www/;
        index index.php index.html index.htm;

        location ~ \.php$ {
            fastcgi_intercept_errors on;
            fastcgi_pass 127.0.0.1:9000;
            fastcgi_index index.php;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            include fastcgi_params;
        }
    }

    server {
        listen 80 default_server;
        index index.html index.php;

        location ~ \.php$ {
            fastcgi_intercept_errors on;
            fastcgi_pass 127.0.0.1:9000;
            fastcgi_index index.php;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            include fastcgi_params;
        }
    }
}
In my hosts file, I point 2 domains, testapp.com and test.com, at 127.0.0.1.
My web files are all stored in /www.
From the above settings, if I visit test.com/phpinfo.php and test.com/app/www, everything works as expected and I get output from PHP.
However, if I visit testapp.com, I get the dreaded No input file specified. error.
So, at this point, I pull out the log files and have a look:
2012/12/19 16:00:53 [error] 12183#0: *17 FastCGI sent in stderr: "Unable to open primary script: /www/app/www/index.php (No such file or directory)" while reading response header from upstream, client: 127.0.0.1, server: testapp.com, request: "GET / HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "testapp.com"
This baffles me because I have checked again and again, and /www/app/www/index.php definitely exists! This is also validated by the fact that test.com/app/www/index.php works, which means the file exists and the permissions are correct.
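For good measure, I also checked from the shell (assuming www-data is the user my PHP-FPM pool runs as; adjust if yours differs):

    ls -l /www/app/www/index.php
    # Verify the PHP-FPM user can actually traverse the path and read the file:
    sudo -u www-data cat /www/app/www/index.php > /dev/null && echo "readable"

Both look fine here.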
Why is this happening, and what is the root cause of things breaking for just the testapp.com vhost?
How can I get the total amount of memory used by 32-bit applications and by 64-bit applications from the command line in Windows?
I tried using tasklist /FI "MODULES eq wow64.dll" /FO CSV and then parsing the output and summing. But tasklist just freezes on any command that involves modules (both tasklist /m and tasklist /fi "modules eq wow64.dll" freeze).
Are there any alternatives? Or any idea why tasklist freezes?
I have file names like the ones below:
adn_DF9D_20140515_0001.log
adn_DF9D_20140515_0002.log
adn_DF9D_20140515_0003.log
adn_DF9D_20140515_0004.log
adn_DF9D_20140515_0005.log
adn_DF9D_20140515_0006.log
adn_DF9D_20140515_0007.log
I want to get the year, month, and day from the file name and create directories from them.
Ex: [[ ! -d "$BASE_DIR/$year/$month/$day" ]] && mkdir -p "$BASE_DIR/$year/$month/$day";
How can I achieve this? Any ideas or scripts would be appreciated; a rough sketch of what I'm aiming for is below.
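Here's the sketch (BASE_DIR is hypothetical, and I'm guessing the date is always the third underscore-separated field):

    #!/bin/bash
    BASE_DIR=/data/logs   # hypothetical destination root
    for f in adn_*_*_*.log; do
        datepart=$(echo "$f" | cut -d_ -f3)   # e.g. 20140515
        year=${datepart:0:4}
        month=${datepart:4:2}
        day=${datepart:6:2}
        [[ ! -d "$BASE_DIR/$year/$month/$day" ]] && mkdir -p "$BASE_DIR/$year/$month/$day"
    done

Is there a cleaner or more robust way to do this?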
Does JungleDisk use HTTPS for file transfers? If so, does this mean a third party cannot intercept the content, or even the file names, of files being backed up? (Assume JungleDisk's encrypt option is not being used.)
I'm trying to get nginx to play nice with php-cgi, but it's not quite working how I'd like. I'm using some set variables to allow for dynamic host names -- basically anything.local. I know that part is working, because I can access static files properly; however, PHP files don't work. I get the standard "No input file specified." error, which normally occurs when the file doesn't exist, but it definitely does exist, and the path is correct because I can access static files in the same path. It could possibly be a permissions thing, but I'm not sure how that could be an issue. I'm running this on Windows under my own user account, so I think it should have permission, unless php-cgi is running under a different user without me telling it to.
Here's my config:
worker_processes 1;

events {
    worker_connections 1024;
}

http {
    include       mime.types;
    default_type  application/octet-stream;
    sendfile      on;
    keepalive_timeout 65;
    gzip on;

    server {
        # Listen for HTTP
        listen 80;

        # Match to local host names.
        server_name *.local;

        # We need to store a "cleaned" host.
        set $no_www $host;
        set $no_local $host;

        # Strip out www.
        if ($host ~* www\.(.*)) {
            set $no_www $1;
            rewrite ^(.*)$ $scheme://$no_www$1 permanent;
        }

        # Strip local for directory names.
        if ($no_www ~* (.*)\.local) {
            set $no_local $1;
        }

        # Define default path handler.
        location / {
            root ../Users/Stephen/Documents/Work/$no_local.com/hosts/main/docs;
            index index.php index.html index.htm;

            # Route non-existent paths through Kohana system router.
            try_files $uri $uri/ /index.php?kohana_uri=$request_uri;
        }

        # Pass PHP scripts to FastCGI server listening on 127.0.0.1:9000.
        location ~ \.php$ {
            root ../Users/Stephen/Documents/Work/$no_local.com/hosts/main/docs;
            fastcgi_pass 127.0.0.1:9000;
            fastcgi_index index.php;
            include fastcgi.conf;
        }

        # Prevent access to system files.
        location ~ /\. {
            return 404;
        }
        location ~* ^/(modules|application|system) {
            return 404;
        }
    }
}
My friend has a Sony Ericsson Xperia (most probably an X8) handset, which is an Android-based smartphone. He is able to receive files over Bluetooth but not able to send them. I don't remember exactly which version of Android he currently has. He has tried to download and install the required update using the phone itself (he has internet access) from http://www.sonyericsson.com, but was not able to install it. So he asked me to help him. Which version of Android do you think he has? Does it seem like he does not have Android 2.1 installed? It is written here (click on Xperia X8) that
If your phone already has Android 2.1,
you can use your mobile network* or a
WiFi connection to download the
software.
Is it possible that he does not have an up-to-date Android version and so is not able to download the software? If so, does he need to upgrade to Android 2.1 first? Should that be done by connecting the phone to a PC?
I'm having an issue installing KB973685 via Windows Update and manually. I receive the error below when trying to install it manually:
Error opening installation log file.
Verify that the installation log file
exists and is writable.
Any ideas how to resolve this?
I have started using ack, which is much faster than grep. However, with ack I want to search for file names rather than file contents. Is there a way to do that?
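For what it's worth, the sort of thing I was hoping for is below (I haven't confirmed these are the right flags):

    # List files whose names match a pattern, without grepping their contents:
    ack -g 'somepattern'
    # Or list all files ack would search and filter the names:
    ack -f | grep 'somepattern'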
This is very similar to Question 326211, but in this case, the LAN is an unstable Wi-Fi connection.
I need to transfer about 11 GiB of files between two computers, both running Linux (although one may be rebooted into Windows). Their connection is both slow and unstable (due to Linux's awful Wi-Fi support), but removable media (such as a flash drive or external hard drive) is not an option at this time.
Right now, I'm slowly transferring the files one by one over SFTP, but I have to reconnect each computer approximately every 90 seconds, and the computers are not very close to each other, so this is not feasible.
This is not a duplicate of Question 30186; that one specifically concerns Windows 7, and all the proposed solutions involve closed-source, Windows-only programs (which are all spyware IMHO, and are all off the table even if I trusted them - one of the computers is Linux-only.)
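My current hand-rolled workaround is essentially a retry loop around rsync (host and paths are placeholders), which at least resumes partial files, but it still needs constant babysitting:

    # Keep retrying until rsync exits cleanly; --partial resumes interrupted files.
    until rsync -av --partial --timeout=30 /local/files/ user@otherbox:/destination/; do
        echo "connection dropped; retrying in 5s..." >&2
        sleep 5
    done

I'm hoping there's something better suited to a link this flaky.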
Hey All,
I like to watch Windows Media Center recorded TV files on my laptop in bed. I find, though, that when the programs are in HD I get a lot of stuttering and delays -- no doubt because of the amount of data being transferred.
I actually have a fair amount of space on the laptop's hdd, and wouldn't mind moving the files onto that hard drive, where no doubt my problem would go away. But that requires some planning & time for the files to move.
Is there a utility out there that would kind of 'trickle' the files over to the laptop over a long period of time, w/out soaking its bandwidth? Something like ms' BITS tech?
Both machines are running win7.
Many thanks!
-Roy
I've got an unresponsive Rackspace slice that has defied all attempts at access. I created an emergency image from it and deleted the slice, downloading the files that comprise the image to a local source. There are a number of files/assets I would still like to recover from this server if possible, but I'm not sure exactly what I can do with the image files, if anything.
Here are the files I have, for what it's worth:
emergency_########_######_cloudserver########.tar.gz.0 (5gb)
emergency_########_######_cloudserver########.tar.gz.1 (5gb)
emergency_########_######_cloudserver########.tar.gz.2 (5gb)
emergency_########_######_cloudserver########.tar.gz.3 (50mb)
emergency_########_######_cloudserver########.yml (25kb)
Is it possible to mount this image as a drive? Are there other forensic recovery options?
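My first instinct was to simply reassemble and unpack the pieces locally, something like the following (this assumes the numbered files are plain sequential segments of one tarball, which I haven't verified):

    # Concatenate the segments in order (.0 through .3 sort correctly) and extract.
    cat emergency_*_cloudserver*.tar.gz.* > image.tar.gz
    tar -tzf image.tar.gz | head          # peek at the contents first
    mkdir -p restored && tar -xzf image.tar.gz -C restored/

Is that roughly right, or is the .yml file needed to reassemble them some other way?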
My question is pretty simple and is actually stated in the title. One of my applications throws "too many open files" errors at me, even though the limit for the user the application runs as is higher than the default of 1024 (lsof -u $USER reports 3000 open fds).
Because I cannot imagine why this happens, I guess there might be a maximum per process.
Any idea is very appreciated!
Edit: Some values that might help...
root@Debian-60-squeeze-64-minimal ~ # ulimit -n
100000
root@Debian-60-squeeze-64-minimal ~ # tail -n 4 /etc/security/limits.conf
myapp soft nofile 100000
myapp hard nofile 1000000
root soft nofile 100000
root hard nofile 1000000
root@Debian-60-squeeze-64-minimal ~ # lsof -n -u myapp | wc -l
2708
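I also looked at the per-process view, along these lines (using pgrep to find the PID; myapp stands in for the real process name):

    # Inspect the limits the running process actually has:
    grep 'open files' /proc/$(pgrep -o myapp)/limits
    # Count the file descriptors it currently holds:
    ls /proc/$(pgrep -o myapp)/fd | wc -l

If there is a separate per-process ceiling somewhere, I'd love to know where it's configured.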
Problem:
Whether I'm playing the media with Rhythmbox on Ubuntu, Winamp on Windows, or my Nokia N95's media player, most of my audio files (OK, maybe only 40%) play twice.
Info:
I have a 500GB external 2.5" WD HDD, with a 150GB primary FAT32 partition labeled MUSIC.
Inside this, I have about 500 folders containing about 10,000 MP3/WMA/M4A/WAV files.
I manage the drive using Ubuntu 9.10, and frequently copy data to/from it using rsync or, on Windows, TotalCopy.
The visual output is different in each media player, but it behaves as if the one MP3 file has the same song on it twice, and as soon as it ends it begins again.
Winamp shows that the song goes for 2x as long as it should, The N95's media player shows the progress bar off the right-hand-side of the screen when it begins playing (then jumps back to the left, then continues along...).
Rhythmbox doesn't show me how long the song is, nor does the progress bar move along the screen.
Plea:
It seems to me that somewhere along the line my collection has become corrupt... but where? And how? And please, someone tell me I can fix it!!
TIA, Dean.
I am running a Windows server with ASP.NET websites, SQL Server 2008, and IIS 6.
It is working fine.
Now I need to move my ASP.NET websites to another Windows server, and I'm having a hard time setting up the correct file security on the new server.
Is there any way to compare, copy, or view the differences in file security between two servers?
Hi, I'd like to know how to configure Tomcat 6 so that I can replace a single file from a war -- for instance an image or a JSP -- without needing to restart the server to pick up the change. I assume I'll have to deploy it as a directory, rather than just copying the war file to webapps? Something like the sketch below?
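That is, instead of dropping the war into webapps as-is (the paths and names here are hypothetical):

    # Deploy as an exploded directory so individual files can be swapped in place.
    mkdir -p "$CATALINA_HOME/webapps/myapp"
    unzip -o myapp.war -d "$CATALINA_HOME/webapps/myapp"
    # Later, replace a single file without a restart:
    cp updated-logo.png "$CATALINA_HOME/webapps/myapp/images/logo.png"

Is that the right approach, and does it require any server.xml or context configuration changes?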
Thanks.
Hey everyone,
On my site, I currently only allow users to import images from other sites rather than uploading them directly. The main reason for this is that I don't have much storage space on my host (relatively speaking), and the host charges quite a bit for additional space. What are the alternatives for hosting the images users upload (max 1 MB each)? Would it be a good idea to purchase separate cheap hosting with "unlimited space" (I know that's not true, but I'm guessing it's more than 1 GB)? Or are there caveats with this approach (e.g. security, since the site should not be browsable but accessed via another server)? Are there alternative ideas that I could employ?
Thanks for any suggestions
I have this bash script which nicely backs up my database on a cron schedule:
#!/bin/sh
PT_MYSQLDUMPPATH=/usr/bin
PT_HOMEPATH=/home/philosop
PT_TOOLPATH=$PT_HOMEPATH/philosophy-tools
PT_MYSQLBACKUPPATH=$PT_TOOLPATH/mysql-backups
PT_MYSQLUSER=*********
PT_MYSQLPASSWORD="********"
PT_MYSQLDATABASE=*********
PT_BACKUPDATETIME=`date +%s`
PT_BACKUPFILENAME=mysqlbackup_$PT_BACKUPDATETIME.sql.gz
PT_FILESTOKEEP=14
$PT_MYSQLDUMPPATH/mysqldump -u$PT_MYSQLUSER -p$PT_MYSQLPASSWORD --opt $PT_MYSQLDATABASE | gzip -c > $PT_MYSQLBACKUPPATH/$PT_BACKUPFILENAME
The problem with this is that it will keep dumping backups into the folder and never clean up old files. This is where the variable PT_FILESTOKEEP comes in: whatever number it is set to is the number of backups I want to keep. All backups are timestamped, so ordering them by name descending gives you the latest first.
Can anyone please help me with the rest of the bash script to add the cleanup of old files? My knowledge of bash is lacking, and I'm unable to piece together the code to do the rest. The furthest I've got is the fragment below.
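It would be appended after the mysqldump line, but I'm not confident it's safe (and xargs -r is GNU-specific, I believe):

    # Keep only the newest $PT_FILESTOKEEP backups. The epoch timestamp in the
    # name means a reverse lexical sort puts the newest first.
    cd "$PT_MYSQLBACKUPPATH" && \
        ls -1 mysqlbackup_*.sql.gz | sort -r | tail -n +$((PT_FILESTOKEEP + 1)) | xargs -r rm -f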
I'm curious to know if there are any tools for restoring disk images (or even transferring files) via multicast -- for any platform, especially if the project has source available -- where the multicast rate adjusts itself on the fly.
On the Mac, all multicast solutions I am aware of (such as Deploy Studio, and NetRestore before it) make use of multicast ASR (apple software restore), which has one glaring deficiency -- you have to set the multicast speed before you start sending a disk image over the network, and that speed is locked in. Either your clients can keep up and restore, or they can't*.
It seems to me that it must be possible for the multicast server to adjust the data rate, so you basically say "start sending this image", clients connect, and, if they can't keep up, they tell the server so it slows down. (Likewise, I'd expect the server to try speeding up if no client is having difficulties keeping up, and I'd expect to be able to cap that maximum throughput so that other network activities can go on without being resource starved.)
So, what sort of tools are out there? For Linux? Windows? Is there something for the Mac I've overlooked? [It just kills me that, by the time you get multicast up and running at a good speed to restore a lab, you could have unicasted the data to all the computers and been done.]
* There is a little leeway involved. I think individual clients can say, "I missed a little bit of data" and get it, and they can opt to listen in the next time the image is sent over the network, but on the whole, if they missed it the first go round, you have to image the machine again, and there is no time savings.