My company is evaluating a plan to migrate from Windows to Linux.
Can you suggest something in Linux analogous to roaming user profiles and domain users in an Active Directory environment?
For a Linux or Windows system, what tricks do you use to optimize your Subversion server?
The following are my current tricks for a Linux system serving over Apache with HTTPS, backed by Active Directory via LDAP authentication:
Enabling KeepAlive on Apache
Disabling SVNPathAuthz
Increasing the LDAP cache
Using the FSFS storage method instead of BDB
Feel free to call this into question. I don't have hard proof that FSFS outperforms BDB, only lots of tribal knowledge and hearsay. A rough sketch of these settings in Apache config follows.
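For reference, here is roughly how the first three look in our httpd.conf (the paths, LDAP URL, and cache sizes are illustrative placeholders, not our real values):

KeepAlive On
KeepAliveTimeout 5

# mod_ldap shared cache -- sizes are examples, tune for your directory
LDAPSharedCacheSize 500000
LDAPCacheEntries 1024
LDAPCacheTTL 600

<Location /svn>
    DAV svn
    SVNParentPath /var/svn            # example path
    SVNPathAuthz off                  # skip per-path authorization checks
    AuthType Basic
    AuthName "Subversion repository"
    AuthBasicProvider ldap
    AuthLDAPURL "ldap://ad.example.com/DC=example,DC=com?sAMAccountName" NONE
    Require valid-user
</Location>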
We have a SQL Server machine - it's a VMware image (running on ESXi hardware, etc.).
It has Windows 2008 x64 Standard.
The SQL install is SQL 2008 Standard.
The virtual machine has 12 GB of RAM and 4 virtual CPUs.
The box is suffering from near-100% CPU a lot of the time.
I enabled AWE, but SQL Server only seems to use 3-4 GB of RAM.
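For reference, this is roughly how I enabled it (a sketch from memory):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'awe enabled', 1;
RECONFIGURE;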
Is there a way of making it use the available RAM more effectively - caching results, for example?
Hi,
Since switching to Windows 7 for my desktop, I've started to get really p***ed off at the length of time it takes for Event Viewer to display the application event log (typically 20-30 seconds of disk grinding - presumably to load and cache all the events).
I've just noticed that on Server 2008 R2 it seems instantaneous.
Is my experience typical? Is there any setting I can tweak to make it fast on Windows 7 as well?
Tim
We plan to change the default server options of an SQL2k5 server instance by enabling distributed queries.
The reason is that we want to run "SELECT * FROM OPENQUERY(LOCALSERVER, '...')" -like statements on the server.
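The change we have in mind is roughly this (a sketch; 'Ad Hoc Distributed Queries' is the sp_configure name for the option in SQL 2005):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;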
What are the possible disadvantages of enabling distributed queries?
(There must be a reason why MS set this option to disabled by default...)
We have 3 floors and about 11 switches. We are going to connect these switches so as to get the best performance, and this is what we came up with:
Network Diagram
My Questions are:
Can this plan achieve the best performance?
Is it different to use one 48-port switch instead of two 24-port switches? If not, why?
Any suggestions?
I run dnsmasq locally as a caching server. In the old days, I allowed all INPUT packets from lo+ and set the policy of INPUT to DROP:
-A INPUT -i lo+ -j ACCEPT
Now I have decided to put this in the raw table to speed up rule matching:
-A PREROUTING -i lo+ -j ACCEPT
But it doesn't work as expected. Why? Since packets are processed by the raw table first, then nat, then filter, why doesn't that rule work the same as the old one?
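For reference, the full commands (a sketch of what I believe I ran; the filter policy is unchanged from before):

iptables -P INPUT DROP
iptables -D INPUT -i lo+ -j ACCEPT
iptables -t raw -A PREROUTING -i lo+ -j ACCEPT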
I'm working on a new/fresh Windows 7 32-bit machine that now has IE9 installed. The user is using the Dell Stardock application as his primary "desktop" (all his links are there). When we place an internet link there and click on it, we get the following error message:
There was a problem sending the command to the program.
To me this indicates that IE9 is having trouble going to the website we want, whose address should be passed as a parameter to the browser when it opens.
I don't think this is a Stardock/ObjectDock problem, because we also have some other problems with internet links. For example, we cannot move an internet link from the Desktop to the Quick Launch on the taskbar. When we try, it forces us to put the link with the IE icon as part of the IE menu instead of allowing us to have a shortcut there as its own entry. I should mention, however, that links on the desktop and in the taskbar do work as we expect them to (without showing the above error message).
It appears that this problem started after installing Windows Updates. Since we installed a whole bunch of updates at once I have no idea which one caused the problem.
I did have Google Chrome installed but I uninstalled it since the user wants to use IE. The problem started before I uninstalled Chrome. I also reset the browser settings on IE9. It didn't help.
Next I uninstalled IE9 which took me back to IE8. This actually did resolve the problem but the problem came back as soon as I installed IE9 again.
We have Verizon Internet Security installed. It's actually a McAfee product rebranded to look like Verizon. I'm not crazy about this software, but the customer has a subscription, so we're not planning to change it. I have no reason to believe it is causing the problem, and yet I know that security software is often to blame for strange issues.
I've looked at the registry settings for the following keys, and everything appears to be OK for every single one of them (a sample of how I checked is shown after the list):
HKEY_CLASSES_ROOT\.htm
HKEY_CLASSES_ROOT\.html
HKEY_CLASSES_ROOT\http\shell\open\command
HKEY_CLASSES_ROOT\http\shell\open\ddeexec\Application
HKEY_CLASSES_ROOT\https\shell\open\command
HKEY_CLASSES_ROOT\https\shell\open\ddeexec\Application
HKEY_CLASSES_ROOT\htmlfile\shell\open\command
HKEY_CLASSES_ROOT\Microsoft.Website\Shell\Open\Command
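For anyone who wants to compare, the default values can be dumped like so (same keys as above; HKCR is shorthand for HKEY_CLASSES_ROOT):

reg query "HKCR\http\shell\open\command" /ve
reg query "HKCR\htmlfile\shell\open\command" /ve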
Edit 1:
I've found two potential solutions, but since this is a remote client's machine I won't be able to try them until tomorrow. One is to disable the "Windows Font Cache" service; the other is to clear the IE cache and browsing history. I see there are lots of other suggestions online, but if you take the time to read them through you'll see that they didn't fix this problem.
I installed a forwarding DNS server on CentOS 5.10, and it is resolving addresses, e.g. google.com. When I stopped named (service named stop) and tried to dig (dig @localhost A google.com), it failed to resolve the address. I checked and saw that the caching daemon nscd is running.
Does this mean the server is not caching at all? How can I get it to cache?
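For completeness, the exact test sequence (as described above):

dig @localhost A google.com     # works while named is running
service named stop
dig @localhost A google.com     # now fails to resolve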
named.conf
options
{
    // Those options should be used carefully because they disable port
    // randomization
    // query-source port 53;
    // query-source-v6 port 53;

    // Put files that named is allowed to write in the data/ directory:
    listen-on port 53 {127.0.0.1; 10.0.0.4;};
    directory "/var/named"; // the default
    dump-file "/var/named/chroot/var/named/data/cache_dump.db";
    statistics-file "/var/named/chroot/var/named/data/named_stats.txt";
    memstatistics-file "/var/named/chroot/var/named/data/named_mem_stats.txt";

    // allow-query {localhost; 192.168.0.0/24; 10.0.0.0/8;};
    recursion yes;
    //allow-query { localhost; 10.0.0.0/8;};
    allow-query { localhost; any; };
    allow-query-cache { localhost; any; };
    forward only;
    forwarders {8.8.8.8; 8.8.4.4;};
    dnssec-enable yes;
    // dnssec-lookaside auto;

    /* Path to ISC DLV key */
    // bindkeys-file "/etc/named.iscdlv.key";
    // managed-keys-directory "/var/named/dynamic";
};

logging
{
    channel default_debug {
        file "data/named.run";
        severity dynamic;
    };
};
How do you figure out the current size of a SharePoint web application? Better yet, the size of a site collection or a subsite?
I am planning to move a site collection from one farm to another. I need to plan the storage capacity first.
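Would something like this PowerShell be the right approach (a sketch, assuming the SharePoint 2010 Management Shell; StorageMB is just a label I made up)?

Get-SPSite -Limit All | Select-Object Url, @{n='StorageMB'; e={$_.Usage.Storage / 1MB}}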
If I throw my MySQL/PHP app up on an Amazon EC2 instance (using their AWS Free Usage Tier program) and couple it with CloudFlare (the free plan, of course), roughly how many daily visitors can I comfortably handle before performance starts to suffer? I'm just looking for a rough estimate or educated guess - I understand this setup might be less than ideal, but I'm still very curious nonetheless.
Thanks in advance
Given a floor plan which is too big for any screen, even a 17" one, how can I show it online like a map? It would need further functionality that a browser alone does not have (just zooming the entire image in and out won't do the trick). The image will be broken down into smaller JPGs, so the user will not have to download the whole floor plan at once. It will need zoom in/zoom out buttons, and some way of bookmarking a position (x, y). Open-source solutions preferred.
/etc/my.misc
sda1 -fstype=ntfs,user,exec :/dev/sda1
sda3 -fstype=ntfs,user,exec :/dev/sda3
sda4 -fstype=ntfs,user,exec :/dev/sda4
/etc/auto.master
/my /etc/my.misc --ghost
When I run locate .pdf, I get nothing, because although the mount points (sda1, sda3, ...) are created in /my, there's nothing in them until I access them. Unfortunately this is not good enough for updatedb, and it purges its cache of the /my/sdaX files. How do I prevent/solve this problem?
Hi,
I am having trouble with my laptop. The problem is a slowdown that occurs every 5 minutes or so (it can be seen during games). The fan starts to speed up and the laptop slows down to about 50% of its original fps. This persists for about 30 seconds, and then things go back to normal.
Here are my specs:
Intel T3400 (2.16GHz, 2 cores, 1M cache)
3 GB RAM (PC2-5300 333MHz)
Mobile Intel 4 Series Express (1.26 GB VRAM)
I have a slow machine - mainly, a Celeron with a 250 GB HD.
I'm planning to install a Linux distro and create a bunch of VMs for development.
Which distro should I choose? I plan to use this machine mainly as a small "hypervisor" for other VMs.
Is it possible? What do you suggest?
Thanks!
I've heard many good things about Nginx lately, and I wanted to put it on my Slicehost server. I am in a fix for RAM, and would like to get WordPress and wp-super-cache configured. I was just wondering what the 'recommended way' of setting up PHP is, because I see so many webpages saying their way is correct.
No compiling if possible, please - it makes updating a drag D=
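For example, is an nginx location block like this, handing PHP to a FastCGI backend, the recommended route (a sketch; it assumes a php-cgi/php-fpm process already listening on 127.0.0.1:9000, and the paths are examples)?

location ~ \.php$ {
    include        fastcgi_params;
    fastcgi_pass   127.0.0.1:9000;
    fastcgi_param  SCRIPT_FILENAME $document_root$fastcgi_script_name;
}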
Lately, 7Stacks and Standalone Stacks have been unable to display the icon for any .exe file. They simply show the default icon instead (the white file icon). Rebuilding/fixing the icon cache hasn't fixed the problem.
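For reference, this is roughly how I rebuilt it (the usual delete-and-restart routine):

taskkill /f /im explorer.exe
del /a "%userprofile%\AppData\Local\IconCache.db"
start explorer.exe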
Any ideas on how to fix this? I use Windows 7.
I am setting up Untangle in a Sun VirtualBox VM. I plan on using this machine as a transparent bridge to filter and monitor traffic on my network. I'm not sure how to configure the network adapters for the virtual machine under VirtualBox's "Devices" menu so that it will function as a transparent bridge. I guess what I'm asking is: should both adapters 1 & 2 be set as bridged adapters, or what?
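In other words, is this the right idea (a sketch using VBoxManage; "Untangle" and the adapter names are placeholders for my VM and host NICs)?

VBoxManage modifyvm "Untangle" --nic1 bridged --bridgeadapter1 eth0
VBoxManage modifyvm "Untangle" --nic2 bridged --bridgeadapter2 eth1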
Any help is greatly appreciated.
Using PowerShell, I plan to run many functions on a remote host to gather information.
Here is an example meant to retrieve the contents of a file remotely, just by running a function called getcontentfile with the remote host's name as the parameter:
function getcontentfile
{
    [CmdletBinding()]
    param($hostname)
    # NOTE: $hostname is never used -- this reads C:\fileinfo.xml on the
    # local machine rather than on the remote host.
    $info = Get-Content "C:\fileinfo.xml"
    Write-Host $info
}
This function should return information about the remote host to the local instance of PowerShell. How can I modify this script to do that?
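The direction I'm considering (a sketch, assuming PowerShell remoting is enabled on the target, e.g. via Enable-PSRemoting) - is this the right way?

function getcontentfile
{
    [CmdletBinding()]
    param($hostname)
    # run Get-Content on the remote host and return the result locally
    Invoke-Command -ComputerName $hostname -ScriptBlock {
        Get-Content "C:\fileinfo.xml"
    }
}
getcontentfile server01   # 'server01' is a placeholder host name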
I've been able to use this Linux command to connect Netcat to a serial port:
nc -l 80 <> /dev/ttyS0
I would like to be able to log this transaction. My backup plan is to use Wireshark to monitor the netcat stream, but ideally I'd like to do something like this:
cat /dev/ttyS0 | tee upstream.bin | nc -l 80 | tee downstream.bin > /dev/ttyS0
This tries to open ttyS0 twice and therefore throws a permissions error. Does anyone know a smarter way to do this?
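I also wondered whether something like socat could do it in one step (an untested sketch; -x hex-dumps the transferred data to stderr):

socat -x TCP-LISTEN:80 /dev/ttyS0 2> transfer.log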
I've just got a really old MacBook, and I want to upgrade it to the latest OS version. I don't know whether it can be upgraded; the configuration of this MacBook is as below:
Model Name: MacBook
Model Identifier: MacBook1,1
Processor Name: Intel Core Duo
Processor Speed: 1.83 GHz
Number of Processors: 1
Total Number of Cores: 2
L2 Cache: 2 MB
Memory: 1 GB
Bus Speed: 667 MHz
Boot ROM Version: MB11.006
If I increase the memory, can I upgrade it?
Hi, I am trying to perform an update on multiple sites that use an open-source CMS by untarring a patch file in each site's httpdocs directory. My plan was to perform a find for the patch file and then untar it, using the following command:
find . -name "patchfile.tar.gz" -exec tar -xzvf {} \; -print
but it doesn't seem to work successfully.
Anyone have any ideas as to why not?
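One thought: since -exec runs tar from the directory I start the find in (so everything extracts there), perhaps -execdir, which runs the command from each matched file's directory, is what I need (untested):

find . -name "patchfile.tar.gz" -execdir tar -xzvf {} \; -print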
Many thanks.
How are users able to actually use features such as pinching if they don't have touchscreen monitors?
I ask this because:
I plan to use Windows 8 on my desktop, and I want to get the full use of any applications I download.
I want to ensure that if I ever release an application I develop to the Windows Store, there is a way for my users to get around this situation (no touchscreen for multi-touch events).
I suspect that the most stressful use of an HDD is being the target of in-progress torrent downloads.
I plan to get a new HDD and use the old one only for storing downloads in progress.
Is this true?
How much stress is there on an idle HDD?
:)
I know I can rely on the tutorials I find on the net, but they are confusing, and I know there are some Postgres gurus here.
My hardware:
2x Intel Xeon E5645 2.40 GHz / 32 GB of RAM / 2x SAS for pg_xlog (RAID 1) + 3x2 SAS for the rest (RAID 10) + BBU with 512 MB of cache.
It's going to be used for one web project with some big (relative term :P) tables (20M+ rows).
The server is dedicated to the Postgres service. I need help with configuring it - work_mem, effective_cache_size, etc.
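My tentative starting point, based on common rules of thumb (every number here is a guess I'd like corrected, not a recommendation):

shared_buffers = 8GB                  # ~25% of RAM
effective_cache_size = 24GB           # rough guess at OS cache + shared_buffers
work_mem = 32MB                       # per sort/hash operation, per connection
maintenance_work_mem = 1GB
wal_buffers = 16MB
checkpoint_segments = 32
checkpoint_completion_target = 0.9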