Search Results

Search found 22961 results on 919 pages for 'memory management'.

Page 618 of 919

  • Insert Hyperlink via VBA

    - by Martin
    I have a Word VBA macro that loops through a directory and writes the file paths of files matching some criteria into a new Word document. It works well as plain text (as part of a loop):

        wdDocResults.Content.InsertAfter objFile.Path & Chr(13)

    However, I'd like them to be hyperlinks. The following works as a single macro, but when called from within another script it does nothing at all (no matter whether the path is provided as a variable or a string, or as H:... or \\MyServernameAsNetDrive...):

        ActiveDocument.Hyperlinks.Add Anchor:=Selection.Range, Address:=objFile.Path, _
            SubAddress:="", ScreenTip:="", TextToDisplay:=objFile.Path

    If I try to select the current line in order to make sure something is selected at the right place, I get an "out of memory" error:

        wdDocResults.Content.InsertAfter objFile.Path
        Selection.Expand wdLine
        ActiveDocument.Hyperlinks.Add Anchor:=Selection.Range, Address:=objFile.Path, _
            SubAddress:="", ScreenTip:="", TextToDisplay:=objFile.Path

    I also tried inserting a string resembling the HYPERLINK field code ({ HYPERLINK "..." }), which is of course not recognized... Any help is appreciated... Thanks in advance!
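
    A common workaround is to anchor the hyperlink to an explicit Range instead of relying on Selection, which may not point anywhere useful when the macro is called from another script. A minimal VBA sketch, assuming wdDocResults is the target Document and objFile comes from the enclosing loop:

        Dim rng As Range
        Set rng = wdDocResults.Content
        rng.Collapse Direction:=wdCollapseEnd      ' anchor at the end of the document, no Selection involved
        wdDocResults.Hyperlinks.Add Anchor:=rng, _
            Address:=objFile.Path, TextToDisplay:=objFile.Path
        wdDocResults.Content.InsertParagraphAfter  ' start a new line for the next entry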

    Read the article

  • Setting up a private network using a Linksys router

    - by user287745
    Scenario: a database server running SQL Server 2005 and SQL Server Management Studio 2005 (Express editions); a web server running IIS 5.0 on Windows XP Pro; and two other computers running Windows XP and Windows 98. I have a Linksys router which I use as an access point for wireless (laptop). There are 5 sockets behind it: four for clients and one for the internet. I would like to set up a LAN -- something like a private hosting area with two clients. What should I do? Where do I connect what, and what would the changes in settings be? Right now it uses DHCP or something to assign IPs. Will the web server be attached to the internet socket? Where will the DB server be attached? Any guide, links, help -- thank you

    Read the article

  • Apps not starting on Mac OS X Lion

    - by KPS
    I have a strange problem: some apps that I download do not launch. There are 2 scenarios: (1) the application starts but keeps bouncing; once I click the bouncing icon it disappears, and the process also closes in Activity Monitor. (2) The application fails to start at all; however many times I launch it, it refuses, and the process does not even show up in Activity Monitor. What I have done so far to resolve it: repaired permissions; cleared all cache/temp files using CleanMyMac; used DiskWarrior via the OS and not during boot. Additional info:

        Model Name: MacBook Pro
        Model Identifier: MacBookPro5,2
        Processor Name: Intel Core 2 Duo
        Processor Speed: 2.8 GHz
        Number of Processors: 1
        Total Number of Cores: 2
        L2 Cache: 6 MB
        Memory: 8 GB
        Bus Speed: 1.07 GHz
        Boot ROM Version: MBP52.008E.B05
        SMC Version (system): 1.42f4

    Read the article

  • How Can I prevent a specific application from being run on a specific machine using Group Policy?

    - by Mike
    I know this is possible to do and I am working on it with limited success. I believe the Group Policy I want is "Don't run specified Windows applications" -- I can enable this and add the .exe I want to the list of programs not to be run. I have tried this on my local machine by running gpedit.msc, going to User Configuration > Administrative Templates > System, and then choosing that policy, editing and enabling it. Doing it this way verifies that it works, as I could then not run the specified .exe (XenAppWeb.exe), so this is great. I have created a GPO to do the same thing in Group Policy Management on my domain controller where we centralize this, enforced it, applied it to an OU, and put one of our machines into this OU to test it. I have let it sit there for 3 days and run gpupdate /force, yet when I try to run XenAppWeb.exe on that machine, it still lets me run it fine. What can I look at to troubleshoot this? I should note that I am trying to enact this policy on Windows XP machines (virtual machines). Thanks, Mike
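
    One way to check whether the policy actually reached the client is to look for the registry values this setting writes. A sketch of what should appear under the logged-on user's hive once the GPO applies (XenAppWeb.exe is the example executable from the question):

        Windows Registry Editor Version 5.00

        [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer]
        "DisallowRun"=dword:00000001

        [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer\DisallowRun]
        "1"="XenAppWeb.exe"

    If those values are missing, running gpresult on the XP machine should show whether the GPO is being filtered out before it applies. Note that this is a User Configuration policy: it applies to user accounts, so a GPO linked to an OU containing only the computer account will not take effect unless loopback processing is enabled.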

    Read the article

  • Computer turns off and on after start... then goes dead

    - by Shiki
    I built a new PC from the following components:

        CPU: Intel Core i7 950
        MB: Gigabyte X58A-UD3R
        RAM: 2x2 GB Corsair memory
        VGA: Zotac AMP2 GTX260
        HDD: 1 GreenSATA HDD (Western Digital 500 GB RE2)

    When I turn it on, it runs for a few seconds with fans at maximum speed, then turns off. Then it starts again by itself... and runs with fans at max speed, but nothing happens. First I suspected my PSU, a Chieftec 450AA. After I borrowed a Chieftec 550AA PSU, I tried to start with that: exact same story. Any idea? Do I need a bigger PSU? The reason I can't localize the fault: I've never seen this on-off-on behavior before. If you can explain just that part, it would already help people like me with the same problem.

    Read the article

  • How to create a filesystem mountable by Windows in Linux?

    - by wcoenen
    I have attached an external USB disk to my Debian GNU/Linux system. The disk showed up as device /dev/sdc, and I prepared it like this: created a single partition with fdisk /dev/sdc (and some more commands in the interactive session that follows), then formatted the partition with mkfs.msdos /dev/sdc1. If I then attach the USB disk to a Windows XP or Vista system, no new drive becomes available. The disk and its partition show up fine in the Disk Management tool under "Computer Management", but apparently the file system in the partition is not recognized. How do I create a FAT32 file system which can actually be used in Windows? Edit: I've given up on this and went with an NTFS file system created by Windows. In Debian Lenny this can be mounted read-write, but apparently it requires you to install the "ntfs-3g" package and explicitly pass the -t ntfs-3g option to the mount command.
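
    For reference, a sketch of the usual recipe for a FAT32 volume Windows will mount, assuming the disk really is /dev/sdc and the dosfstools package is installed. The usual culprits are the partition type byte being left at 83 (Linux) and the FAT size not being forced to 32-bit:

        # in fdisk /dev/sdc: create one primary partition, then use 't' to set its type to 'b' (W95 FAT32)
        mkfs.vfat -F 32 -n USBDISK /dev/sdc1   # -F 32 forces FAT32; -n sets the volume label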

    Read the article

  • CSC folder data access AND roaming profiles issues (Vista with Server 2003, then 2008)

    - by Alex Jones
    I'm a junior sysadmin for an IT contractor that helps small, local government agencies, like little towns and the like. One of our clients, a public library with ~50 staff users, was recently migrated from Server 2003 Standard to Server 2008 R2 Standard in a very short timeframe; our senior employee, the only network engineer, had suddenly put in his two weeks notice, so management pushed him to do this project before quitting. A bit hasty on management's part? Perhaps. Could we do anything about that? Nope. Do I have to fix this all by myself? Pretty much.

    The network is set up like this: a) 50ish staff workstations, all running Vista Business SP2; all staff use MS Outlook, which uses RPC-over-HTTPS ("Outlook Anywhere") for cached Exchange access to an offsite location. b) One new (virtualized) Server 2008 R2 Standard instance, running atop a Server 2008 R2 host via Hyper-V; the VM is the domain's DC, and also the site's one and only file server. Let's call that VM "NEWBOX". c) One old physical Server 2003 Standard server, running the same roles. Let's call it "OLDBOX". It's still on the network and accessible, but it's been demoted, and its shares have been disabled. No data has been deleted. d) Gigabit Ethernet everywhere. The organization has only one domain, and it did not change during the migration. e) Most users were set up for a combo of redirected folders + offline files, but some older employees who had been with the organization a long time are still on roaming profiles. To sum up: the servers in question handle user accounts and files, nothing else (e.g., no TS, no mail, no IIS, etc.)

    I have two major problems I'm hoping you can help me with:

    1) Even though all domain users have had their redirected folders moved to the new server, and logging in to their workstations and testing confirms that the Documents/Music/Whatever folders point to the new paths, it appears some users (not laptops or anything either!) had been working offline from OLDBOX for a long time, and nobody realized it. Here's the ugly implication: a bunch of their data now lives only in their CSC folders, because they can't access the share on OLDBOX and finally sync with it. How do I get this data out of those CSC folders and onto NEWBOX?

    2) What's the best way to migrate roaming profile users to non-roaming ones, without losing vital data like documents, any lingering PSTs, etc.?

    Things I've thought about trying for problem 1:

    a) Re-enable the documents share on OLDBOX, force an Offline Files sync for ALL domain users, then copy OLDBOX's share's data to the equivalent share on NEWBOX, and reinitialize the Offline Files cache for every user. With this: How do I safely force a domain-wide Offline Files sync? Could I lose data by re-enabling the share on OLDBOX and forcing the sync? Afterwards, how can I reinitialize the Offline Files cache for every user, without doing it manually, workstation by workstation?

    b) Determine which users have unsynced changes to OLDBOX (again, how?), search each user's CSC folder domain-wide via workstation admin shares, and grab the unsynced data, then reinitialize the Offline Files cache for every user. With this: How can I detect which users have unsynced changes with a script? How can I search each user's CSC folder, when the ownership and permissions set for CSC folders are so restrictive? Again, afterwards, how can I reinitialize the Offline Files cache for every user without doing it manually?

    c) Manually visit each workstation, copy the contents of the CSC folder, and manually copy that data onto NEWBOX, then reinitialize the Offline Files cache for every user. With this: Again, how do I 'break into' the CSC folder and get to its data? (One possible way in is sketched below, after this question.) As an experiment, I took one workstation's HD offsite, imaged it for safety, and then tried the following with one of our shop PCs after attaching the drive: grant myself full control of the folder (failed), grant myself ownership of the folder (failed), run chkdsk on the whole drive to make sure nothing's messed up (all OK), try to take full control of the entire drive (failed), try to take ownership of the entire drive (failed). MS KB articles and Googling around suggest there's a utility called CSCCMD that's meant for this exact scenario... but it looks like it's available for XP, not Vista, no? Again, afterwards, how can I reinitialize the Offline Files cache for every user without doing it manually?

    For problem 2: a) Figure out which users are on roaming profiles, and where their profiles 'live' on the server. Create new folders for them in the redirected folders repository, migrate existing data, and disable the roaming. With this: Finding out who's roaming isn't hard. But what's the best way to disable the roaming itself -- in AD Users and Computers, or on each user's workstation? Doing it centrally on the server seems more efficient; that said, all of the KB research I've done turns up articles on how to go from local to roaming, not the other way around, so I don't have good documentation on this.

    In closing: we have good backups of NEWBOX and OLDBOX, but not of the workstations themselves, so anything drastic on the client side would need imaging and testing for safety. Thanks for reading along this far! Hopefully you can help me dig us out of this mess.
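
    On 'breaking into' the CSC folder from the attached drive: a sketch of one approach using tools built into Vista-era Windows, run from an elevated command prompt on the shop PC, assuming the imaged drive is mounted as X: and D:\CSC-rescue is a scratch destination (both paths are assumptions):

        takeown /f X:\Windows\CSC /r /d y
        icacls X:\Windows\CSC /grant Administrators:F /t
        robocopy X:\Windows\CSC D:\CSC-rescue /e /copy:DAT /r:1 /w:1

    The first command takes ownership recursively, the second grants Administrators full control, and the third copies the tree out with data, attributes, and timestamps. On Vista the cache is laid out by server/share namespace (unlike XP's database-style cache), so the rescued tree should still need sorting per user and share.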

    Read the article

  • Limit number of simultaneous connections squid makes to a single server

    - by Ben Voigt
    Note: I am asking about outbound concurrent connection limits, not inbound, which is sufficiently covered on existing questions Modern browsers typically open a large number of simultaneous connections, to take advantage of the fact that TCP fairly shares bandwidth between connections. Of course, this doesn't result in fair sharing between users, so some servers have started penalizing hosts which open too many connections. This limit can be configured client-side (e.g. IE MaxConnectionsPerServer, Firefox network.http.max-connections-per-server), but the method differs for each browser and version, and many users aren't competent to adjust it themselves. So we turn to a squid transparent HTTP proxy for central management of HTTP download. How can the number of simultaneous connections from squid to a remote webserver be limited, so the webserver doesn't perceive it as abuse of concurrent connections? Ideally the limit would be per source address. Squid should accept virtually unlimited concurrent requests from the client browser, and issue them sequentially to the remote server, only N at a time, delaying (but not dropping) the others.

    Read the article

  • Asus laptop not retaining Aero theme when unplugged

    - by expiredninja
    The problem I'm having occurs when I unplug my laptop: it disregards several elements of my theme. The taskbar becomes unlocked, my desktop background turns white, and the transparency on my windows goes away. I can fix these things by running the Aero troubleshooting utility, which mentions something about the color bit depth. These are the specs for the computer:

        Asus laptop / Intel® Pentium® processor / 15.6" display / 4 GB memory
        Model: X54H-BD3MA
        SKU: 4005394

    I'm sure there are duplicates of this question, I'm just not sure which one applies to my case. Thank you. Also, I think someone should add an "unplugged" tag.

    Read the article

  • Problem opening password-encrypted .docx file in Word 2003

    - by molecule
    Hi all, I am having a problem opening a .docx file in Word 2003. I have installed the Compatibility Pack for 2007, but when I try to open this particular file, I receive the error: "Word experienced an error trying to open the file. Try these suggestions: 1. Check the file permissions for the document. 2. Make sure there is sufficient free memory and disk space. 3. Open the file with the Text Recovery converter." I do not think it is any of those errors, as I am able to open it on a different PC with Word 2003 as well. I also do not have any issues opening any non-password-encrypted .docx files. Has anyone experienced the same issue? Most posts on the internet relate to "open and repair", but as mentioned, I am able to open this file on another PC without any problems. Any advice is greatly appreciated. Thanks, George

    Read the article

  • Which Windows OS Supports 8 GB RAM in a Laptop and Suggestions for a Better Laptop for Personal & De

    - by Ellen
    I am about to purchase a laptop and have zeroed in on the following two: the Toshiba L500-ST2544 and the Toshiba L505-ES5034. The common specification for both is as follows:

        RAM: 4 GB DDR3 memory
        HDD: 320 GB
        Processor: Intel® Core™ i3-330M
        Webcam and mic: available
        HDMI port: available
        Numeric keypad: available
        OS: Windows 7 (64-bit) Home Premium

    Now, the only difference between the ST2544 and the ES5034 is that the ST2544 has a maximum of 2 slots with 2 GB in each, so you can have a max of 4 GB RAM in it. The ES5034 can support 8 GB RAM, so in a couple of years, if I want to add another 4 GB RAM, I will be able to do it. The price for the ST2544 is USD 629.00, whereas the price for the ES5034 is USD 685.00 -- a difference of USD 56.00 (not a major amount, but still something extra). Is it worthwhile going for the ES5034? And which Windows operating system supports 8 GB of RAM?

    Read the article

  • Nginx + PHP-FPM on Centos 6.5 gives me 502 Bad Gateway (fpm error: unable to read what child say: Bad file descriptor)

    - by Latheesan Kanes
    I am setting up a standard LEMP stack. My current setup is giving me the following error: 502 Bad Gateway. Here are the configurations I've created/updated so far for what is currently installed on my server; can someone take a look at the following and see where the error might be? I've already checked my logs; there's nothing in there (http://i.imgur.com/iRq3ksb.png), and I saw the fpm error quoted in the title in the /var/log/php-fpm/error.log file. Side note: both nginx and php-fpm have been configured to run under a local account called www-data, and the folders referenced below exist on the server.

    nginx.conf (global nginx configuration):

        user www-data;
        worker_processes 6;
        worker_rlimit_nofile 100000;
        error_log /var/log/nginx/error.log crit;
        pid /var/run/nginx.pid;

        events {
            worker_connections 2048;
            use epoll;
            multi_accept on;
        }

        http {
            include /etc/nginx/mime.types;
            default_type application/octet-stream;

            # cache information about FDs; frequently accessed files can boost performance
            open_file_cache max=200000 inactive=20s;
            open_file_cache_valid 30s;
            open_file_cache_min_uses 2;
            open_file_cache_errors on;

            # to boost IO on HDD we can disable access logs
            access_log off;

            # copies data between one FD and the other from within the kernel -- faster than read() + write()
            sendfile on;

            # send headers in one piece; better than sending them one by one
            tcp_nopush on;

            # don't buffer data sent; good for small data bursts in real time
            tcp_nodelay on;

            # server will close connection after this time
            keepalive_timeout 60;

            # number of requests a client can make over keep-alive -- for testing
            keepalive_requests 100000;

            # allow the server to close connections on non-responding clients; this will free up memory
            reset_timedout_connection on;

            # request timed out -- default 60
            client_body_timeout 60;

            # if the client stops responding, free up memory -- default 60
            send_timeout 60;

            # reduce the data that needs to be sent over the network
            gzip on;
            gzip_min_length 10240;
            gzip_proxied expired no-cache no-store private auth;
            gzip_types text/plain text/css text/xml text/javascript application/x-javascript application/xml;
            gzip_disable "MSIE [1-6]\.";

            # Load vHosts
            include /etc/nginx/conf.d/*.conf;
        }

    conf.d/www.domain.com.conf (my vhost entry):

        ## Nginx php-fpm Upstream
        upstream wwwdomaincom {
            server unix:/var/run/php-fcgi-www-data.sock;
        }

        ## Global Config
        client_max_body_size 10M;
        server_names_hash_bucket_size 64;

        ## Web Server Config
        server {
            ## Server Info
            listen 80;
            server_name domain.com *.domain.com;
            root /home/www-data/public_html;
            index index.html index.php;

            ## Error log
            error_log /home/www-data/logs/nginx-errors.log;

            ## DocumentRoot setup
            location / {
                try_files $uri $uri/ @handler;
                expires 30d;
            }

            ## These locations would be hidden by .htaccess normally
            #location /app/ { deny all; }

            ## Disable .htaccess and other hidden files
            location /. {
                return 404;
            }

            ## Magento uses a common front handler
            location @handler {
                rewrite / /index.php;
            }

            ## Forward paths like /js/index.php/x.js to relevant handler
            location ~ .php/ {
                rewrite ^(.*.php)/ $1 last;
            }

            ## Execute PHP scripts
            location ~ \.php$ {
                try_files $uri =404;
                expires off;
                fastcgi_read_timeout 900;
                fastcgi_pass wwwdomaincom;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                include fastcgi_params;
            }

            ## GZip Compression
            gzip on;
            gzip_comp_level 8;
            gzip_min_length 1000;
            gzip_proxied any;
            gzip_types text/plain application/xml text/css text/js application/x-javascript;
        }

    I've got a file at /home/www-data/public_html/index.php with the code <?php phpinfo(); ?> (file uploaded as user www-data). /etc/php-fpm.d/www-data.conf is my php-fpm pool config:
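
    A minimal sketch of a pool matching the socket path and www-data account used by the vhost above -- the original paste repeated the vhost block at this point, so the pm values here are illustrative assumptions rather than the real settings:

        [www-data]
        user = www-data
        group = www-data
        listen = /var/run/php-fcgi-www-data.sock
        ; the socket must be readable by the account nginx workers run as:
        listen.owner = www-data
        listen.group = www-data
        pm = dynamic
        pm.max_children = 10
        pm.start_servers = 2
        pm.min_spare_servers = 2
        pm.max_spare_servers = 4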

    Read the article

  • 1 teraflop cluster?

    - by Adobe
    I want to buy a $40,000 1-teraflop cluster to keep in a room. What are the standard configurations? The cluster is supposed to do molecular dynamics simulations on biological systems. The selling company I'm dealing with has proposed 4 PCs with 8 cores each. It looks like I also need InfiniBand. Does anyone have experience with this -- e.g., what physical memory should I buy? I know things change very quickly... Still, there might be a point or two to state. Edit: the OS is supposed to be Linux; the application is GROMACS.

    Read the article

  • How to get an external 2.5 inch HDD enclosure recognised

    - by fireBand
    I am completely out of ideas. I have a 2.5-inch HDD from my old laptop, and I have tried 2 different HDD enclosures, but my desktop does not recognize it -- it doesn't even show up in Disk Management or the BIOS. The HDD works fine if I use it in a 3.5-inch enclosure with external power. I have tried 3 different Y USB cables. I have also got a 7-port self-powered 2A USB hub, and I have updated my chipset drivers as well. The problem is with my desktop; the enclosures work fine with my new laptop. Anything else that I should try? Thanks in advance.

    Read the article

  • What's causing "shutdown state" after TFTP reloaded Cisco `running-config` on 871?

    - by xtian
    Cisco CCP's Write Configuration borked my 871W config while I was trying to set up port forwarding. I tested the 871's flash memory with fsck and rewrote the minimal config for TFTP (which is the same for Cisco's CCP app). Then I successfully uploaded a previously working running-config from Windows Vista using SolarWinds TFTP Server; unfortunately, the restore was not entirely successful. The old running config was saved to the 871's startup-config and I can log in using the console port. Some other things that are working are the hostname and welcome message, but that's about it. After the TFTP load, startup shows an error: SETUP: new interface NVI0 placed in "shutdown" state. The missing light on the access point modem for the Ethernet link shows that the 871's outside FE4 is not working. So... what's the possible problem with reloading a previously working config (approximately 4 months with the same config) via TFTP? Is there something I can look for on the 871 to verify the config?
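
    A config copied onto the router via TFTP merges with the running config rather than cleanly replacing it, and interfaces can come out of the process administratively down. A minimal IOS sketch for checking and re-enabling the WAN port, assuming FastEthernet4 is the outside interface as on a stock 871:

        show ip interface brief          ! look for interfaces marked "administratively down"
        configure terminal
         interface FastEthernet4
          no shutdown
         end
        copy running-config startup-config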

    Read the article

  • RAID recovery on a Gigabyte GA-8I945P Pro

    - by epeleg
    This was a working machine until a few days ago. Now it won't boot into the OS, and during startup it makes clicking sounds (I think from one of the drives). Installed OS: Windows 2003 Web Edition. Hardware: Gigabyte GA-8I945P Pro, 2x 160 GB SATA drives in a RAID1 configuration, 2 volumes -- 25 GB and the rest. When I installed Windows on it, during setup I pressed F6 and used the ICH7DH RAID drivers. The manual for the MOBO says: "Step 1: After the POST memory test begins and before the operating system boot begins, look for a message which says 'Press CTRL+I to enter Configuration utility' (Figure 4). Press CTRL+I to enter the RAID BIOS setup utility." But the machine never shows this message. The BIOS SATA RAID/AHCI Mode is set to RAID. Any ideas or pointers on what I can do to recover my data? Thanks

    Read the article

  • How to add additional disks to a Windows 2008 KVM-based guest?

    - by taazaa
    I have a Win 2008 KVM-based guest VM running on an Ubuntu 10 host. It is a raw image of 22G. I want to add a "data" drive which would show up as the "D:\" drive on the guest. I first created a raw image using: qemu-img create -f raw ~/vmdisk2.img 50G. Then I tried attaching it using virsh attach-disk. When that did not work, I tried editing the XML file of the VM directly. Neither seemed to work. I would greatly appreciate any help on how to do this and what the best practice is. I want to keep the base image small, so that I can (hopefully) clone it and then attach the necessary storage based on the application at hand. Update: the XML of the VM before adding the second drive:

        <domain type='kvm'>
          <name>win08e-vm1</name>
          <uuid>183a4ba0-1c0b-0b04-ad01-aa7c3a4cb390</uuid>
          <memory>1048576</memory>
          <currentMemory>1048576</currentMemory>
          <vcpu>2</vcpu>
          <os>
            <type arch='x86_64' machine='pc-0.12'>hvm</type>
            <boot dev='hd'/>
          </os>
          <features>
            <acpi/>
            <apic/>
            <pae/>
          </features>
          <clock offset='localtime'/>
          <on_poweroff>destroy</on_poweroff>
          <on_reboot>restart</on_reboot>
          <on_crash>restart</on_crash>
          <devices>
            <emulator>/usr/bin/kvm</emulator>
            <disk type='file' device='disk'>
              <driver name='qemu' type='raw'/>
              <source file='/var/lib/libvirt/images/win08e-vm1.img'/>
              <target dev='hda' bus='ide'/>
              <address type='drive' controller='0' bus='0' unit='0'/>
            </disk>
            <disk type='file' device='cdrom'>
              <driver name='qemu' type='raw'/>
              <source file='/home/taazaa/iso/Win08ER264.iso'/>
              <target dev='hdc' bus='ide'/>
              <readonly/>
              <address type='drive' controller='0' bus='1' unit='0'/>
            </disk>
            <controller type='ide' index='0'>
              <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x1'/>
            </controller>
            <interface type='bridge'>
              <mac address='52:54:00:7f:a7:ae'/>
              <source bridge='br0'/>
              <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0'/>
            </interface>
            <serial type='pty'>
              <target port='0'/>
            </serial>
            <console type='pty'>
              <target type='serial' port='0'/>
            </console>
            <input type='tablet' bus='usb'/>
            <input type='mouse' bus='ps2'/>
            <graphics type='vnc' port='-1' autoport='yes' keymap='en-us'/>
            <video>
              <model type='vga' vram='9216' heads='1'/>
              <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0'/>
            </video>
            <memballoon model='virtio'>
              <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0'/>
            </memballoon>
          </devices>
        </domain>

    Thanks!
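
    For reference, a sketch of the <disk> element that would describe the second image, matching the conventions of the domain XML above, added inside <devices> via virsh edit win08e-vm1 (the hdb target and the image path are assumptions -- the target just needs to be an unused IDE slot, and the path should point at the actual file):

        <disk type='file' device='disk'>
          <driver name='qemu' type='raw'/>
          <source file='/home/taazaa/vmdisk2.img'/>
          <target dev='hdb' bus='ide'/>
        </disk>

    After a full guest shutdown and start (IDE disks generally cannot be hot-plugged), the new disk should appear in Windows' Disk Management, where it can be brought online, initialized, and formatted as D:.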

    Read the article

  • users connecting to remote desktop to use software

    - by Jordan
    I'm an IT assistant at a CNC milling company, and we use a program called Made2Manage. It's ERP (enterprise resource planning) software. Each license is something like $5k, so instead of giving each employee their own copy of the software, my boss has everyone that uses the program connect to a server that has a copy of M2M on it. It's slow when there are a bunch of people connected to it, but I guess they don't want to buy more licenses. Is there a better way to do something like this? How bad of a practice is this?

    Read the article

  • What happens if an OpenStack cloud controller dies?

    - by magu
    I've been reading up on OpenStack and how we can re-create an EC2/S3-style cloud for our internal development, and I'm having a hard time finding information on how the OpenStack cloud controller provides redundancy of the cloud management services. I know I can set up multiple Swift and Nova nodes, but not a single document/article/howto/wiki contains information on: a) what happens if the cloud controller node dies; and b) how to set up redundant cloud controllers. It seems to me that, although it is massively scalable, there is a big single point of failure built into OpenStack. Can anyone with more experience of OpenStack please shed some light on how it all works in regards to high availability?

    Read the article

  • Which CMS for a mobile app? No HTML, just XML or JSON

    - by Sascha
    I am a newbie to content management systems. I need a CMS that can deliver content as XML or JSON to a client. It is OK if the CMS can also manage HTML websites, but the priority is on data transfer over a web service. Which is the best CMS to use here? I want to avoid spending endless hours learning all the big CMS systems just to find out that they don't support this feature or that it's badly integrated. Thanks.

    Read the article

  • certificate error while subdomain forwarding

    - by rahulchandran
    I have a website, call it http://sub.example.com, hosted on, say, 72.xx.xx.x. There is a certificate for https://sub.example.com. Now I go into the DNS management tool at my hosting provider, and I set up the standard subdomain forwarding wherein https://sub.example.com forwards to 72.xx.xx.x. When I try to browse to https://sub.example.com, I get a certificate error saying the certificate is for the wrong website. I have also tried forwarding http://sub.example.com to 72.xx.xx.x, and tried it with domain masking in both cases; I am still getting the certificate error no matter what. An additional wrinkle: if someone types in https://sub.example.com, the domain forwarding does not seem to work at all and IE just spins endlessly and finally fails. How can I domain-forward https://sub.example.com to 72.xx.xx.x?
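
    Worth noting when reading this: "subdomain forwarding" is usually an HTTP redirect served by the provider's own server, which presents the provider's certificate rather than the one for sub.example.com, hence the mismatch error. Pointing the name directly at the host avoids that hop entirely. A sketch of the equivalent direct DNS record in zone-file syntax (the IP is the placeholder from the question):

        sub.example.com.    IN  A   72.xx.xx.x

    With an A (or CNAME) record the browser connects straight to the host holding the certificate; a redirect-style "forward" can never do that for https, because the provider's redirector terminates the TLS handshake with its own certificate first.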

    Read the article

  • How to determine non-movable files in Windows 7?

    - by David
    Is there a way to determine which unmovable files are preventing Shrink Volume from releasing the full potential free space? Background: I have a 90 GB partition with Windows 7 on it, with 60 GB free. I want to shrink it down to about 40 GB and use the reclaimed 50 GB for a separate data partition. The Shrink Volume tool in Disk Management is only willing to give me 8 GB back. My understanding is that this is because of unmovable files. I've followed the instructions found here, which involved disabling hibernation, the pagefile, System Restore, and kernel dumps, making sure all related files were deleted, and defragmenting. I have successfully followed those same instructions before on this same drive, and partitioned the original 150 GB space into 90 GB and 60 GB, but I'm not so lucky this time.
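
    One diagnostic sometimes suggested: when the shrink analysis runs, the defragmenter writes an event to the Application log naming the last unmovable file it hit. A sketch of pulling the most recent of those events from an elevated prompt (Event ID 259 from the Defrag source is the usual candidate on Windows 7, but treat the ID as an assumption to verify in Event Viewer):

        wevtutil qe Application /q:"*[System[(EventID=259)]]" /f:text /c:1 /rd:true

    The message should contain a line like "The last unmovable file appears to be: \System Volume Information\...", which identifies what to disable or move next.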

    Read the article

  • mongod fork vs nohup

    - by Daniel Kitachewsky
    I'm currently writing process management software, and one package we use is mongo. Is there any difference between launching mongo with mongod --fork --logpath=/my/path/mongo.log and nohup mongod >> /my/path/mongo.log 2>&1 < /dev/null & ? My first thought was that --fork could spawn more processes and/or threads, and it was suggested to me that --fork could be useful for changing the effective user (downgrading privileges). But we run everything under the same user (both the process manager and mongod), so is there any other difference? Thank you
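
    One way to see what each launch style actually produces is to compare the resulting process's parent, process group, session, and controlling terminal; a daemonizing --fork typically re-parents to init and detaches from the terminal, while the nohup variant stays in the launching shell's session. A sketch using procps ps on Linux:

        ps -o pid,ppid,pgid,sess,tty,comm -C mongod

    With --fork you would expect PPID 1 and TTY "?", whereas the nohup version keeps the shell's session and terminal until the shell exits.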

    Read the article

  • Problem running a Flash program from a flash drive

    - by rajivpradeep
    Hi, I have a USB drive with two partitions in it, one hidden and the other normal. I have an application which swaps the partitions and runs the Flash application in the hidden zone. The problem is that the application works fine on Windows 7, but when run on Windows XP it swaps the partitions yet doesn't run the Flash application; it just keeps running in the background, and I can see it in Task Manager. However, when I copy the application to the desktop and run it, it runs with no glitch. I was facing the same problem on Windows 7 too, but it ran as required when I used "Run in XP mode", and then I applied a shim and it has been running as required since. The application is built using VC++ 2008. What might be the problem?

    Read the article

  • FTP server (vsftpd) with web GUI

    - by manutenfruits
    I want to build a file server to let users upload and download mostly multimedia, but also common files. Right now I have an Arch installation with vsftpd, and I'm about to install miniDLNA for multimedia sharing. The only problem is that FTP doesn't seem to fit my needs, because it almost always forces users onto a client such as FileZilla to make the server friendly. I have been looking for a web frontend for vsftpd, but apart from management interfaces there's nothing. I need a frontend accessible from a browser, through which users can navigate the folders in an easier and more elegant way than the plain FTP listing that browsers render by default. It should let users upload files and, as an awesome extra, let them play the multimedia directly in the browser. For this, I am willing to dump FTP if needed; I've heard about HTTP file servers but don't know much about them. I could code everything myself, but there's gotta be something out there already.

    Read the article
