Search Results

Search found 2579 results on 104 pages for 'mike schall'.


  • Postfix mailq - send every x minutes

    - by Mike
    I have about 2000 clients on my website who have subscribed to our mailing list. In the past I've used Swift Mailer, but it didn't work the way it was supposed to. I'm wondering if there is a way Postfix could keep emails in the mailq (when lots of emails are sent at the same time) and send chunks of 20-30 emails every 10-20 minutes. That way, our server won't get blacklisted. Any suggestions will be appreciated.
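
    One approach, sketched below, is to slow Postfix's delivery rate rather than batching the queue by hand. These are real main.cf parameters, but the values here are illustrative guesses, not tuned recommendations, and the throttling applies per destination domain rather than globally:

        # /etc/postfix/main.cf -- a minimal throttling sketch (values illustrative)
        default_destination_concurrency_limit = 2    # at most 2 parallel deliveries per destination
        default_destination_rate_delay = 30s         # pause between deliveries to the same destination

    Note that default_destination_rate_delay requires Postfix 2.5 or later.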

    Read the article

  • How can I find a laptop if it has a different model name all over the world?

    - by Mike
    CONUNDRUM: A laptop review in the UK talks about how brilliant the "ASUS ABCDE 55" is, but in America, France, etc. there is no laptop by that name. In fact it's called the "ASUS 12345 AB" - AAARRGH! QUESTION: Is there a way of finding out all the diverse names the same laptop goes by around the world? Example: if Samsung creates an R2D2500, what is that same-spec laptop called in all the other countries (if they release it, of course)? Or if it's not released, what is their similar-spec laptop called in the other countries? I understand that specs may differ, but if I read a review on my trusted UK website and live in, say, Australia, I want to be able to find the name of the same laptop in Australia and then check out local places to buy it. So if anyone knows a technique, a specific website, or even how to use a company website to track down these annoying name changes, I'd really appreciate it.

    Read the article

  • How to back up a network volume to my Time Capsule?

    - by Mike
    I have a Time Capsule that I'm using for my backups. I also have a network volume (coincidentally on the same Time Capsule) that I'd like to back up. How can I tell Time Machine to back up network volumes in addition to my main laptop hard drive? PS: yes, I know this setup isn't ideal. It'll incur 2x network overhead when backing up the network volume, and my data won't be safe in the event of a drive failure, since both copies will be on the same disk. However, it will give me some small amount of safety if I accidentally delete files on the network volume, among other things.
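
    Time Machine only backs up locally mounted volumes, so a common workaround is to mirror the network share into a local folder that Time Machine then backs up as part of the normal run. A minimal sketch, assuming the share is mounted at /Volumes/Data (the mount point and mirror path are illustrative, not from the question):

        # mirror the network share into a local folder Time Machine will pick up
        rsync -a --delete /Volumes/Data/ ~/Data-mirror/

    Scheduling that via launchd or cron keeps the mirror current between backups.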

    Read the article

  • Facebook unresponsive in Chrome, Pepper Flash disabled

    - by Mike Pateras
    Facebook becomes unresponsive a second or two after I open it in Chrome. This happens every few weeks, and it's incredibly annoying. I've disabled Chrome's internal Pepper Flash: http://imgur.com/8hlqvaX I've also tried disabling all extensions, and even tried incognito mode. Facebook works just fine in Firefox, but it's totally unusable in Chrome. The problem eventually goes away, but it keeps coming back. Chrome version: 30.0.1599.101 (Official Build 227552) m How can I fix this?

    Read the article

  • Puppet yum repo - Pull down 2.7.x vs 3.0.x

    - by Mike Purcell
    So a few weeks ago I started down the path of using Puppet to automate all the configs/services. At the time I was using the EPEL repo, which installed version 2.6.x. After some reading I wanted access to the flatten method available via the Puppet stdlib, and thought it was available by default in the newer 2.7.x version. So I added a Puppet repo with the following settings:

        [puppetlabs]
        name=Puppet Labs Packages
        baseurl=http://yum.puppetlabs.com/el/$releasever/products/$basearch/
        enabled=1
        gpgcheck=1
        gpgkey=http://yum.puppetlabs.com/RPM-GPG-KEY-puppetlabs

    The problem is that this installed v3.0.x instead of 2.7.x. And apparently 3.0.x is a major upgrade, released only a few weeks ago. Obviously I would prefer to stay on 2.7.x for the next few months while Puppet Labs fixes whatever defects inevitably arise after a major release. So my question is: what setting can I add to the Puppet repo config to pull down only the 2.7.x branch and not the 3.0.x branch?
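
    One way to do this, sketched below, is yum's exclude directive, which takes package-name globs in the repo stanza. The glob assumes the 3.x packages are named puppet-3*, so verify the actual package names with yum list first:

        [puppetlabs]
        name=Puppet Labs Packages
        baseurl=http://yum.puppetlabs.com/el/$releasever/products/$basearch/
        enabled=1
        gpgcheck=1
        gpgkey=http://yum.puppetlabs.com/RPM-GPG-KEY-puppetlabs
        exclude=puppet-3*

    Alternatively, yum install puppet-2.7.19 (version string illustrative) pins an exact release for a single transaction.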

    Read the article

  • How to automatically execute a shell script when logging into Ubuntu

    - by Mike Rowave
    How do I get a script to execute automatically when I log in? Not when the machine starts up, and not for all users, but only when I (or any specific user with the script) log in via the GNOME UI. From reading elsewhere I thought the answer was .bash_profile in my home directory, but for me it has no effect. When I manually execute it in a terminal window by typing ~/.bash_profile it works, but it won't run automatically when I log in. I'm running Ubuntu 11.04. The file permissions on my .bash_profile are -rwx------. No .bash_profile existed in my home directory before I created it today. I seem to remember older versions of Linux having a .profile file for each user, but that doesn't work either. How is it done? Do I need to configure something else to get the .bash_profile to work? Or does the per-user login script need to go in some other file?
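
    .bash_profile is only read by login shells, and a GNOME desktop session doesn't start one, which would explain the behaviour described. A common alternative is a per-user autostart entry; a minimal sketch, assuming the script lives at /home/mike/bin/login-script.sh (the path and names are illustrative):

        # ~/.config/autostart/login-script.desktop
        [Desktop Entry]
        Type=Application
        Name=Login script
        Exec=/home/mike/bin/login-script.sh
        X-GNOME-Autostart-enabled=true

    GNOME runs every .desktop entry in ~/.config/autostart at session login, for that user only.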

    Read the article

  • Is there a way to change the string format for an existing CSR "Country Code" field from UTF8 to Printable String?

    - by Mike B
    CentOS 5.x The short version: Is there a way to change the encoding of an existing CSR's "Country Code" field from UTF8String to PrintableString? The long version: I've got a CSR generated by a product using the standard Java security providers (JSSE/JCE). Some of the information in the CSR uses UTF8Strings (which I understand is the preferred encoding as of December 31, 2003 - RFC 3280). The certificate authority I'm submitting the CSR to explicitly requires the Country Code to be specified as a PrintableString, but my CSR has it as a UTF8String. I went back to the latest RFC - http://www.ietf.org/rfc/rfc5280.txt. It seems to conflict specifically on countryName. Here's where it gets a little messy... The countryName is part of the relative DN. The relative DN is defined to be of type DirectoryString, which is defined as a choice of teletexString, printableString, universalString, utf8String, or bmpString. It also more specifically defines countryName as being either alpha (upper bound 2 bytes) or numeric (upper bound 3 bytes). Furthermore, in the appendix, it refers to X520countryName, which is limited to a PrintableString of size 2. So it is clear why it doesn't work: the certificate authority and Sun/Java disagree in their interpretation of the requirements for countryName. Is there anything I can do to modify the CSR to be compatible with the CA?
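
    Because a CSR is signed, editing the field's encoding in place would invalidate the signature, so the practical route is usually to regenerate the request with a tool that emits PrintableString; OpenSSL pins countryName to PrintableString in its string table. A minimal sketch, assuming you have the private key (file names and subject are illustrative):

        # inspect how the existing request encodes its fields
        openssl asn1parse -in request.csr

        # regenerate the CSR; OpenSSL emits C= as a PrintableString
        openssl req -new -key server.key -out new-request.csr \
            -subj "/C=US/O=Example Corp/CN=www.example.com"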

    Read the article

  • Moving distribution lists in Public Folders to SharePoint Contacts?

    - by Mike
    Now I know that if I connect to a contact list in SharePoint and drag and drop everything from the Exchange Public Folder contact list to the SharePoint contact list (connected through Outlook), it will transfer everything in the contact list. What about distribution lists? Has anyone found a workaround for this? If a contact list is full of distribution lists, the contacts won't migrate over and the Sync Issues - Local Failures folder fills up with all the distribution lists that couldn't be migrated. Is there a way to migrate distribution lists? Any ideas how to set up the SharePoint contact list like a public folder contact list of distribution lists? How would that contact list look in SharePoint? Should I just leave the contact lists that have distribution lists in public folders?

    Read the article

  • Problem with SQL connection on virtualised machines

    - by Mike Selby
    We have just virtualised our web servers but are experiencing problems when trying to connect with the existing DB server. The error message that is being returned from the virtual machines is as follows: A connection was successfully established with the server, but then an error occurred during the login process. (provider: TCP Provider, error: 0 - The specified network name is no longer available.) Any help would be much appreciated - Many thanks in advance.

    Read the article

  • MacBook Pro screen goes dark

    - by Mike M
    I've had my MacBook Pro for two years now with no problems so far (it has had 3rd-party RAM from the get-go). Today I was copying a particularly large VM from an external disk drive to the local MacBook disk. It had about 3GB to go when I stepped away to do some other things, and when I came back the screen was dark. The computer was still on, but I couldn't see anything. I forced a reboot by holding down the power button; it starts up with the chimes, but still no screen. I've done this several times. Any ideas? Do you think the hard disk activity caused it to get too hot?

    Read the article

  • Explorer: programmatically select file/directory with space in the path

    - by Mike L.
    When I try to select a file or directory whose path contains a space in Windows Explorer, it selects a completely different directory:

        explorer.exe "/select,C:\Program Files\foobar"

    I've tried it from Java with

        Runtime.getRuntime().exec(new String[] { "explorer.exe", "/select," + filePath });

    and with the above command line. In both cases, the same result. What can I do to solve the problem?
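
    Explorer's /select argument appears to break when the quoting wraps the whole argument rather than just the path. A commonly suggested workaround, offered here as an untested sketch, is to quote only the path inside /select and pass the command as a single string so Java doesn't re-quote the argument:

        // quote only the path, not the whole /select argument (sketch, untested)
        Runtime.getRuntime().exec("explorer.exe /select,\"" + filePath + "\"");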

    Read the article

  • SharePoint site settings add on SSL port number?

    - by Mike
    WSS 3.0, IIS 6, Windows Server 2003, CAG. We have several WSS sites on a SharePoint WSS box that face the outside, all of which are SSL-enabled, so a CAG (Citrix Access Gateway) translates the standard port 443 to the local SSL port on the server. Everything is set up and works fine until you get into the Site Settings and start rooting around; the link library there seems very unstable. Links will try to use the local SSL port number instead of the standard 443, skipping the CAG translation step. Is it the site? Any ideas on how to fix it?

    Read the article

  • Multiple Instances of aspnet_wp.exe?

    - by Mike Keller
    Is this normal? I was poking around Google, and from what I found it should be, but we have 60 of them running and only two .NET apps on this particular server. All of them are at 0% CPU and about 3 MB of memory usage. We can't really restart the server right now to see if that clears them out; we store a lot of data on it that people need regularly. Any clues for investigating this issue would be appreciated. This particular box is an oldie: Windows 2000 and .NET 2.0. Thanks.

    Read the article

  • Setup Firefox to save .pages as .zip automatically

    - by Mike Dtrick
    What do I want to do? I would like Firefox to save files with the .pages extension as .zip files automatically.

    Scenario: You are browsing through your emails and you notice your friend just sent you an email with a file attached (a .pages in this example). Unfortunately, you have a laptop that runs Windows. Your friend continues to send tons of emails with .pages files attached and you are tired of manually saving the files as a .zip file. Ultimately, you would like Firefox to be set up so that the download/file manager recognizes the .pages extension and automatically converts it to a .zip file.

    What have I done? I have saved files manually by selecting save as "All Files" and setting the extension to .zip. I've gone through Firefox and their documentation and have not found anything on how to complete this task.

    Why am I doing this? To save time (only a few seconds, not the main reason). I would like to set up a simple solution that "converts" a file automatically without having to recall steps on how to achieve the task manually (for clients who aren't exactly tech savvy), and so that clients with Windows can access the files.

    IMPORTANT NOTE: I am not trying to save the web page, rather an Apple document equivalent to Microsoft Word.

    UPDATE: The really easy method would be to save one file, right click it, choose properties and open all .pages files up with WinRAR (or any other program that extracts files from a compressed folder). For the sake of learning, I am going to "neglect" this method and continue to do some research on Firefox add-ons. I would still like to have Firefox or the download manager do the bulk of the work for converting the file.

    Read the article

  • Sabnzbd Installed on Linux NAS

    - by Mike Szp.
    I installed SABnzbd on a Linux-formatted NAS. The directory it downloads to is mapped differently on the NAS itself, because the path that SABnzbd knows about starts in its own folder. If this sounds confusing, let me give you an example:

        \\MYNAS\Volume_1\

    That is the path of the drive on the NAS. I would like my SABnzbd downloads to go to:

        \\MYNAS\Volume_1\Downloads

    Right now SABnzbd is installed to:

        \\MYNAS\Volume_1\ffp\opt\optware\share\SABnzbd

    And the default download directory (as indicated in SABnzbd) is:

        /ffp/opt/optware/share/SABnzbd/downloads/complete

    I know the mapping is different somehow because it is installed on the NAS, but I'm lost as to what I should do. So far, I have tried the following for the complete folder:

        /192.168.restofip/Volume_1/downloads/complete
        /Volumes/Volume_1/downloads/complete
        /Volume_1/downloads/complete

    Does anyone know how to change the path so that I can have it download to one of the topmost folders on the NAS instead of a folder so deep in the drive?
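
    SABnzbd needs the NAS's local filesystem path, not the SMB (\\MYNAS\...) path, and the two are related by whatever mount point the NAS exports as Volume_1. A quick way to find it from the same shell SABnzbd runs in (output varies by NAS model):

        # list mounted filesystems and look for the volume backing the SMB share
        df -h
        mount

    Whatever local path corresponds to Volume_1 (often something like /mnt/HD_a2 on ffp-based NAS boxes, but that's an assumption, so check your own output), the download folder would then be that path plus /Downloads/complete.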

    Read the article

  • Bluetooth radio device is not available

    - by Mike
    I reinstalled my Windows 7 operating system, and have since been unable to detect my Logitech M555b Bluetooth mouse. Here is some info about my system: I have a working connection with a Bluetooth printer. I have a message in the "My Bluetooth" section of Explorer stating "Bluetooth radio device is not available". The Device Manager indicates that the Generic Bluetooth Radio is working correctly. The Bluetooth "F12" switch is on. The mouse batteries are new and the right way around. I'm presuming that I need a non-generic Bluetooth driver. The computer is a Clevo P150HMx (the version with the GTX485 GPU) with the default WLAN/Bluetooth combo card manufactured by Realtek (I think it's a Realtek RTL8188CE card). I think I have the drivers installed, but I still get the generic driver in the Device Manager. I'm confused. Please help, I'm going mad on the touchpad. (Thanks for the touchup wizlog)

    Read the article

  • CentOS 6 - YUM Local Repo - Ensure consistent package distribution

    - by Mike Purcell
    I've read a few guides outlining how to set up a local YUM repo, but none of them explicitly answered my question: if I set up a local YUM repo, does that mean that any CentOS servers which pull from said repo will never be "ahead" of the local YUM repo? I want to ensure a consistent package distribution across all my servers. Right now, even when I run yum update on a daily basis, the servers can fall out of alignment. For example, if I run yum update on my dev server in the morning, then on one of my production servers in the afternoon, the production server may pick up a new version of a package that the dev server did not, due to the time window between the update commands. I'd prefer to run yum update on my dev server, which has access to the remote upstream yum repos, let it sit for 2 weeks, and then run yum update on my production servers against the local repo on my dev server.
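
    That is exactly how a local mirror behaves: clients can only ever see the package versions present in the mirror at its last sync. A minimal sketch of maintaining one with reposync and createrepo from the yum-utils/createrepo packages (paths and repo ids are illustrative):

        # pull the upstream packages down to the dev server, then rebuild the repo metadata
        reposync --repoid=base --repoid=updates -p /var/www/html/mirror
        createrepo --update /var/www/html/mirror

    Production servers then point their .repo files at baseurl=http://dev.example.lan/mirror (hostname illustrative) and can never get ahead of the last reposync.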

    Read the article

  • Completely remove MySQL from macbook pro

    - by mike
    I'm pretty sure I completely removed MySQL from my system, except for one thing. When I type mysql on the command line I get this:

        bash: /opt/local/bin/mysql5: No such file or directory

    How does it still think mysql should be there? I'm trying to build it myself in /usr/local, and when I do install it there, I still get that error message pointing at /opt/local.
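
    That error usually means the shell still has a stale idea of where mysql lives: a leftover alias or symlink from the old MacPorts install, or bash's cached command path. A quick sketch for tracking it down (these are plain bash builtins, nothing MySQL-specific):

        type -a mysql          # show every alias, function, and file the shell resolves
        alias | grep mysql     # check for a leftover alias
        hash -r                # clear bash's cached command locations

    If type points at a stale symlink in your PATH, removing it and re-running hash -r should let the /usr/local build win.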

    Read the article

  • iMac memory limit

    - by Mike
    I have an iMac from the first generation of aluminum iMacs; the reported model is "iMac7,1". This iMac's manual says I can install two 2GB modules, but when the manual was written there were no modules larger than 2GB, and we had Leopard then, which I suppose can handle less memory than Snow Leopard. Today we have 4GB modules, so can I put in two 4GB modules and make it 8GB? Thanks.

    Read the article

  • Why does my 5.1 surround work in testing only?

    - by Mike Pateras
    I've got a 5.1 speaker setup. In both the SoundMax utility (I think it came with my motherboard) and the Windows 7 sound test, all 5.1 speakers work properly, but those are the only times I can get audio to come out of the rear speakers and the center channel. When playing games, video, music, etc., I only seem to get 2.1 channels' worth of sound, even though I've configured everything for 5.1 surround sound. How can I get my 5.1 surround sound working during actual use?

    Read the article

  • Enabled Network Discovery on Server, and now VNC and Squeezebox clients don't work

    - by Mike Hanson
    I've recently set up a Windows Server 2008 box. It's running an email server, Squeezebox server, MS SQL Server, etc., and I do remote maintenance with UltraVNC. I had everything working fine. Then the server needed to access a network share on another machine, and I was prompted to turn on network discovery, which I did, choosing the Home rather than the Public option. Since doing that, some things have stopped working while others are still fine. Shared folders and the email services (ports 25 and 110) are still accessible; VNC (port 5900) and the Squeezeboxes (port 9000) no longer work. Here's what I've tried so far: Checked the network discovery settings to see if anything looked strange. Checked the firewall settings, and those ports appear to be open. Also in the firewall settings, the entries for Private domain Network Discovery were all on, but the Domain/Public ones were off, so I tried turning those on. In the services, turned on Function Discovery Resource Publication and SSDP Discovery. Any other suggestions?
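
    Changing the network location can switch the active firewall profile, so rules that only applied to the old profile silently stop matching even though they still look "open". One hedged check, using the built-in netsh advfirewall tool (the rule names below are illustrative), is to confirm which profile is active and allow the ports on every profile:

        netsh advfirewall show currentprofile
        netsh advfirewall firewall add rule name="UltraVNC" dir=in action=allow protocol=TCP localport=5900 profile=any
        netsh advfirewall firewall add rule name="Squeezebox" dir=in action=allow protocol=TCP localport=9000 profile=any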

    Read the article

  • Monitor mode 802.11 captures on OSX

    - by Mike A
    I'm trying to determine the difference between capturing 802.11 frames in the following two ways on OSX (10.8.5). It's a bit esoteric, but I use "Option 2" to capture frames for later analysis, and am wondering if I'm missing something.

    Option 1: use "airportd":

        $ sudo /usr/libexec/airportd en0 sniff

    Option 2: use "airport" followed by tcpdump:

        sudo /System/Library/PrivateFrameworks/Apple80211.framework/Versions/Current/Resources/airport --channel=<channel>
        sudo tcpdump -I -P -i en0 -w /tmp/capture.pcap

    (or alternatively eliminate the -w and watch packets in real time). From what I can tell: Both commands, according to the wifi icon on OSX, put the interface into 'monitor' mode. Both commands output a pcap file that is readable in both wireshark/tcpdump and Eye PA. Both commands appear to capture management, control, and data frames.

    The rub: Option 1 disconnects you from the network. This is expected when putting an interface into 'monitor' mode. Option 2 does NOT disconnect you, provided you've set the channel to the same channel you're currently connected to. This has the distinct advantage of keeping your connection up while capturing in monitor mode.

    My question: Option 2 does not seem like it should work, or more specifically, it does not seem like I should be able to remain connected while also capturing frames in monitor mode. On a wired NIC you can be 'promiscuous' and still send frames, but I didn't think the same was true for wireless NICs. Is what Option 2 captures actually valid?

    Read the article

  • Puppet - Is it possible to use a global var to pull in a template with the same name?

    - by Mike Purcell
    I'm new to Puppet. As such, I am trying to work out the best way to set up my manifests so they make sense. Following the DRY (don't repeat yourself) principle, I am trying to load common directives in one template, then load in environment-specific directives from a file matching the environment. Basically like this:

        # nodes.pp
        node base_dev {
            $service_env = 'dev'
        }
        node 'service1.ownij.lan' inherits base_dev {
            include global_env_specific
        }
        class global_env_specific {
            include shell::bash
        }

        # modules/shell/bash.pp
        class shell::bash inherits shell {
            notify{"Service env: ${service_env}": }
            file { '/etc/profile.d/custom_test.sh':
                content => template('_global/prefix.erb', 'shell/bash/global.erb', 'shell/bash/$service_env.erb'),
                mode    => 644
            }
        }

    But every time I run puppet agent --test, Puppet complains that it can't find the shell/bash/$service_env.erb file, even though I double-checked that it exists. I know the var is accessible, since the notify statement outputs the expected value, so I suspect I am doing something that is not allowed. I know I could have a single template.erb and pass variables to the template, which would work in this case because the custom.sh file is small and doesn't change much across environments, but for more complex configs (httpd, solr, etc.) I'd prefer to access environment-specific files. I am also aware that I can specify environment-specific module paths, but I'd prefer to handle this behavior at the template level instead of having several closely named directories. Thanks.
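
    One likely culprit, offered as a guess rather than a confirmed diagnosis: Puppet only interpolates variables inside double-quoted strings, so 'shell/bash/$service_env.erb' reaches template() with a literal $service_env in it. A minimal sketch of the quoting fix:

        # single quotes pass the '$' through literally; double quotes interpolate
        content => template('_global/prefix.erb',
                            'shell/bash/global.erb',
                            "shell/bash/${service_env}.erb"),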

    Read the article

  • syslog ip ranges to specific files using `rsyslog`

    - by Mike Pennington
    I have many Cisco / JunOS routers and switches that send logs to my Debian server, which uses rsyslogd. How can I configure rsyslogd to send these router / switch logs to a specific file, based on their source IP address? I do not want to pollute the general system logs with these entries. For instance: all routers in Chicago (source IP block 172.17.25.0/24) should log only to /var/log/net/chicago, and all routers in Dallas (source IP block 172.17.27.0/24) only to /var/log/net/dallas. Finally, these logs should be rotated daily, kept for 30 days, and compressed. NOTE: I am answering my own question
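
    A minimal sketch of one way to do this with rsyslog's property-based filters (classic v5-era syntax, as Debian shipped at the time); note that fromhost-ip matching here is a plain string-prefix test, which happens to line up with /24 blocks, and & ~ discards each matched message so it stays out of the general logs:

        # /etc/rsyslog.d/30-network.conf
        :fromhost-ip, startswith, "172.17.25." /var/log/net/chicago
        & ~
        :fromhost-ip, startswith, "172.17.27." /var/log/net/dallas
        & ~

    And a matching logrotate stanza for the rotation requirement (file name illustrative):

        # /etc/logrotate.d/netlogs
        /var/log/net/* {
            daily
            rotate 30
            compress
            missingok
        }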

    Read the article
