Search Results

Search found 20168 results on 807 pages for 'service'.

Page 532/807 | < Previous Page | 528 529 530 531 532 533 534 535 536 537 538 539  | Next Page >

  • Cannot install any Feature/Role on Windows Server 2008 R2 Standard

    - by Parsa
    I was trying to install the Exchange 2010 prerequisites when I encountered some errors. They all look much the same, like this one: "Error: Installation of [Windows Process Activation Service] Configuration APIs failed. The server needs to be restarted to undo the changes." My server is running Windows Server 2008 R2 Standard Edition. UPDATE: I tried installing the prerequisites one by one using PowerShell. Now I get errors on the RPC over HTTP Proxy feature: "Installation of [Web Server (IIS)] Tracing failed. Attempt to install Tracing failed with error code 0x80070643. Fatal error during installation." Searching for the error code doesn't tell me much more than that something went wrong while trying to update Windows. Installing HTTP Tracing on its own also doesn't make any difference.

    Read the article

  • Unable to connect to local SQL Express (2008)

    - by Will
    I'm trying to get an installation of SQL Express 2008 working, but no matter what I do I can't connect to it with Management Studio. I've enabled TCP/IP for the instance. I've tried connecting with machinename\instance, .\instance, (local), and so on. Nothing works, and I always get the same message. If I browse for a server, the only one listed is the local Integration Services instance, which I can connect to just fine. The SQL Server service (SQLEXPRESS) is running, SQL Server Agent is not (and won't run; it can't be enabled), and SQL Server Browser is running. Anyone have any suggestions on where to go next? I've tried uninstalling everything SQL-related and reinstalling, with no change.
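
    One quick way to take Management Studio out of the equation is to attempt a bare connection from a script. A minimal sketch, assuming Python with the pyodbc package and a SQL Server ODBC driver are available on the box; the SQLEXPRESS instance name comes from the question, everything else is illustrative:

      import pyodbc

      # Try a trusted (Windows) connection to the local named instance.
      # Resolving a named instance relies on the SQL Server Browser service.
      conn_str = (
          "DRIVER={SQL Server};"      # or a newer driver such as {ODBC Driver 17 for SQL Server}
          "SERVER=.\\SQLEXPRESS;"     # machinename\SQLEXPRESS should behave the same way
          "Trusted_Connection=yes;"
      )

      try:
          conn = pyodbc.connect(conn_str, timeout=5)
          row = conn.cursor().execute("SELECT @@SERVERNAME").fetchone()
          print("Connected to:", row[0])
          conn.close()
      except pyodbc.Error as exc:
          # The error text here is usually more specific than the Management Studio dialog.
          print("Connection failed:", exc)

    If this fails with an instance-resolution or network error, that points back at the SQL Server Browser service or the TCP/IP protocol settings rather than at Management Studio itself.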

    Read the article

  • Internet access options in a remote area? (read: no Comcast, Qwest, etc.)

    - by freedrull
    Currently I am living in a fairly "remote" area, in the countryside, and cable internet access through the typical companies like Comcast and Qwest is just not available here. I've been trying to research other options for fast internet access. There are some small cable companies, but they currently do not offer broadband access here. I thought about maybe buying a 3G phone with a data plan and doing some sort of tethering, or perhaps getting an Android phone and using it as a wireless AP. This would of course depend on 3G being available here. The only other thing I can think of is some sort of satellite internet service, or doing something crazy like adapting Wi-Fi over AM radio. Anyone have any ideas, at all, short of moving somewhere else?

    Read the article

  • Tomcat fails to start, no logs or error provided

    - by Alex Kuhl
    I have a CentOS 5 box running Tomcat 5 (a version before 5.5; there's no bin/version.sh script). When attempting to start Tomcat, whether through init.d or service, I get the FAILED message with no other information provided. The date on catalina.out changes, but the file has no contents and is 0 bytes. logging.conf has not been edited and everything is set to FINE detail. Has anyone experienced this and knows of a solution? Or, failing that, how can I get some log/error output from Tomcat to try to pinpoint the issue?
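
    None of this is specific to the failure above, but these are the usual reasons a Tomcat init script reports FAILED while leaving catalina.out empty: no usable JVM in the environment the script runs under, a port conflict, or permissions. A quick checklist sketch in Python 3 (CentOS 5 ships an older Python, so treat this as an illustration of what to check rather than something to paste onto that exact box; the log path is an assumption):

      import os
      import shutil
      import socket

      # 1. Is a JVM visible to the environment the init script will use?
      print("JAVA_HOME =", os.environ.get("JAVA_HOME", "(not set)"))
      print("java on PATH:", shutil.which("java"))

      # 2. Is something already listening on Tomcat's default HTTP port?
      with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
          s.settimeout(1)
          print("Port 8080 already in use:", s.connect_ex(("127.0.0.1", 8080)) == 0)

      # 3. Can the account that launches Tomcat write to the log directory?
      log_dir = "/var/log/tomcat5"   # assumed location
      print("Log dir writable by current user:", os.access(log_dir, os.W_OK))

    Running bin/catalina.sh run from the Tomcat directory (rather than the init script) also starts the server in the foreground, so whatever is failing gets printed to the console instead of disappearing.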

    Read the article

  • VSFTPD Unable to set write permissions on folder

    - by Frank Astin
    I've just set up my first FTP server with vsftpd on CentOS. I can connect to it fine using a user in the group ftp-users, but I get read-only access. I've tried several different chmod modes on the folder (even 777), all to no avail. This is the tutorial I used to set up the server: http://tinyurl.com/73pyuxz. Hopefully you'll be able to see something I missed. Thanks in advance. Requested config file (/etc/vsftpd/vsftpd.conf):

      # Example config file /etc/vsftpd/vsftpd.conf
      #
      # The default compiled in settings are fairly paranoid. This sample file
      # loosens things up a bit, to make the ftp daemon more usable.
      # Please see vsftpd.conf.5 for all compiled in defaults.
      #
      # READ THIS: This example file is NOT an exhaustive list of vsftpd options.
      # Please read the vsftpd.conf.5 manual page to get a full idea of vsftpd's
      # capabilities.
      #
      # Allow anonymous FTP? (Beware - allowed by default if you comment this out).
      anonymous_enable=NO
      #
      # Uncomment this to allow local users to log in.
      local_enable=YES
      #
      # Uncomment this to enable any form of FTP write command.
      write_enable=YES
      #
      # Default umask for local users is 077. You may wish to change this to 022,
      # if your users expect that (022 is used by most other ftpd's)
      local_umask=022
      #
      # Uncomment this to allow the anonymous FTP user to upload files. This only
      # has an effect if the above global write enable is activated. Also, you will
      # obviously need to create a directory writable by the FTP user.
      #anon_upload_enable=YES
      #
      # Uncomment this if you want the anonymous FTP user to be able to create
      # new directories.
      #anon_mkdir_write_enable=YES
      #
      # Activate directory messages - messages given to remote users when they
      # go into a certain directory.
      dirmessage_enable=YES
      #
      # The target log file can be vsftpd_log_file or xferlog_file.
      # This depends on setting xferlog_std_format parameter
      xferlog_enable=YES
      #
      # Make sure PORT transfer connections originate from port 20 (ftp-data).
      connect_from_port_20=YES
      #
      # If you want, you can arrange for uploaded anonymous files to be owned by
      # a different user. Note! Using "root" for uploaded files is not
      # recommended!
      #chown_uploads=YES
      #chown_username=whoever
      #
      # The name of log file when xferlog_enable=YES and xferlog_std_format=YES
      # WARNING - changing this filename affects /etc/logrotate.d/vsftpd.log
      #xferlog_file=/var/log/xferlog
      #
      # Switches between logging into vsftpd_log_file and xferlog_file files.
      # NO writes to vsftpd_log_file, YES to xferlog_file
      xferlog_std_format=YES
      #
      # You may change the default value for timing out an idle session.
      #idle_session_timeout=600
      #
      # You may change the default value for timing out a data connection.
      #data_connection_timeout=120
      #
      # It is recommended that you define on your system a unique user which the
      # ftp server can use as a totally isolated and unprivileged user.
      #nopriv_user=ftpsecure
      #
      # Enable this and the server will recognise asynchronous ABOR requests. Not
      # recommended for security (the code is non-trivial). Not enabling it,
      # however, may confuse older FTP clients.
      #async_abor_enable=YES
      #
      # By default the server will pretend to allow ASCII mode but in fact ignore
      # the request. Turn on the below options to have the server actually do ASCII
      # mangling on files when in ASCII mode.
      # Beware that on some FTP servers, ASCII support allows a denial of service
      # attack (DoS) via the command "SIZE /big/file" in ASCII mode. vsftpd
      # predicted this attack and has always been safe, reporting the size of the
      # raw file.
      # ASCII mangling is a horrible feature of the protocol.
      #ascii_upload_enable=YES
      #ascii_download_enable=YES
      #
      # You may fully customise the login banner string:
      #ftpd_banner=Welcome to blah FTP service.
      #
      # You may specify a file of disallowed anonymous e-mail addresses. Apparently
      # useful for combatting certain DoS attacks.
      #deny_email_enable=YES
      # (default follows)
      #banned_email_file=/etc/vsftpd/banned_emails
      #
      # You may specify an explicit list of local users to chroot() to their home
      # directory. If chroot_local_user is YES, then this list becomes a list of
      # users to NOT chroot().
      #chroot_list_enable=YES
      # (default follows)
      #chroot_list_file=/etc/vsftpd/chroot_list
      #
      # You may activate the "-R" option to the builtin ls. This is disabled by
      # default to avoid remote users being able to cause excessive I/O on large
      # sites. However, some broken FTP clients such as "ncftp" and "mirror" assume
      # the presence of the "-R" option, so there is a strong case for enabling it.
      #ls_recurse_enable=YES
      #
      # When "listen" directive is enabled, vsftpd runs in standalone mode and
      # listens on IPv4 sockets. This directive cannot be used in conjunction
      # with the listen_ipv6 directive.
      listen=YES
      #
      # This directive enables listening on IPv6 sockets. To listen on IPv4 and IPv6
      # sockets, you must run two copies of vsftpd whith two configuration files.
      # Make sure, that one of the listen options is commented !!
      #listen_ipv6=YES
      pam_service_name=vsftpd
      userlist_enable=YES
      tcp_wrappers=YES
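
    Since write_enable=YES and local_enable=YES are already set above, read-only access for a member of ftp-users usually comes down to plain filesystem ownership on the target directory or, on CentOS, to SELinux (booleans such as ftp_home_dir). A small sketch of the filesystem side of that check, assuming Python 3 on the server; the username and path below are placeholders:

      import grp
      import os
      import pwd
      import stat

      user = "ftpuser"            # placeholder: the account you log in with over FTP
      path = "/home/ftpuser"      # placeholder: the directory you are trying to write to

      st = os.stat(path)
      owner = pwd.getpwuid(st.st_uid).pw_name
      group = grp.getgrgid(st.st_gid).gr_name
      print(f"{path}: mode={stat.filemode(st.st_mode)} owner={owner} group={group}")

      # Work out whether `user` actually gets the write bit on this directory
      # (classic owner/group/other logic; ACLs and SELinux are not covered here).
      pw = pwd.getpwnam(user)
      user_groups = {g.gr_name for g in grp.getgrall() if user in g.gr_mem}
      user_groups.add(grp.getgrgid(pw.pw_gid).gr_name)

      if st.st_uid == pw.pw_uid:
          writable = bool(st.st_mode & stat.S_IWUSR)
      elif group in user_groups:
          writable = bool(st.st_mode & stat.S_IWGRP)
      else:
          writable = bool(st.st_mode & stat.S_IWOTH)
      print(f"{user} has the write bit here: {writable}")

    If the write bit is there and uploads are still refused, look at SELinux and at any chroot settings before changing vsftpd.conf further.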

    Read the article

  • Cannot start Tor with Vidalia; failed to bind listening port because tor-socks is running

    - by ganjan
    I get these errors trying to run Tor with Vidalia:

      Apr 19 21:55:15.371 [Notice] Tor v0.2.1.30. This is experimental software. Do not rely on it for strong anonymity. (Running on Linux i686)
      Apr 19 21:55:15.372 [Notice] Initialized libevent version 1.4.13-stable using method epoll. Good.
      Apr 19 21:55:15.373 [Notice] Opening Socks listener on 127.0.0.1:9050
      Apr 19 21:55:15.373 [Warning] Could not bind to 127.0.0.1:9050: Address already in use. Is Tor already running?
      Apr 19 21:55:15.373 [Warning] Failed to parse/validate config: Failed to bind one of the listener ports.
      Apr 19 21:55:15.373 [Error] Reading config failed--see warnings above.

    I don't think Tor is running. Here is an nmap scan of my localhost:

      Starting Nmap 5.21 ( http://nmap.org ) at 2011-04-19 21:59 CEST
      Nmap scan report for localhost (127.0.0.1)
      Host is up (0.0000050s latency).
      Hostname localhost resolves to 2 IPs. Only scanned 127.0.0.1
      rDNS record for 127.0.0.1: localhost.localdomain
      Not shown: 989 closed ports
      PORT      STATE SERVICE
      22/tcp    open  ssh
      53/tcp    open  domain
      80/tcp    open  http
      139/tcp   open  netbios-ssn
      445/tcp   open  microsoft-ds
      631/tcp   open  ipp
      3128/tcp  open  squid-http
      3306/tcp  open  mysql
      9000/tcp  open  cslistener
      9050/tcp  open  tor-socks
      10000/tcp open  snet-sensor-mgmt

    I see tor-socks is listed here, which is probably the cause of the problem. How do I stop it from starting up? I want to use Vidalia so I can monitor what's going on.
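
    The nmap output already shows a listener on 9050, which is almost certainly the distribution's own tor service started at boot; stopping and disabling that service through its init script frees the port for the Tor instance Vidalia launches. A tiny sketch, assuming Python 3, for re-checking the port after the service is stopped:

      import socket

      # Vidalia cannot start its own Tor while something else is bound to the
      # SOCKS port, so confirm 127.0.0.1:9050 is free before launching it.
      with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
          s.settimeout(1)
          busy = s.connect_ex(("127.0.0.1", 9050)) == 0

      print("Port 9050 already in use:", busy)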

    Read the article

  • Suggestion for Colocation?

    - by John M.
    Does anyone have a suggestion for a good colocation company? I just don't know much about who the best colocation providers are. I have been using EC2 and S3, and it is becoming really expensive due to bandwidth costs, especially people downloading content from S3. I have looked at Peer1 and a few other companies. I am not sure why they don't list prices for their colocation service on their websites; they force you to chat with them and waste your time.

    Read the article

  • Data and Secularism

    - by kaleidoscope
    Ever since we've been using Data we've been religious. Religious about the way we represent it, and equally religious about the way we access it. Be it plain old SQL, DAO, ADO or ADO.NET (and I am just referring to the religions in the MSFT world); a peek outside and I'd need a separate book to list out the Data faiths. Various application areas in networked computing are converging under the HTTP umbrella, with a plausible transition to purist HTTP and in turn REST, fuelled by the Web 2.0 storm. It was time the data access faiths also gave up the religious silos wrapped around our long-worshipped data publishing and access methods. OData is the secular solution we have at hand today. It is an open protocol for sharing data. It can be exposed via REST. It is Open as in the Microsoft Open Specification Promise, which allows virtually everyone to build Data Services for any runtime. OData is one of the key standards for data publishing/subscribing on Microsoft Codename "Dallas". For us .Netters, OData data sources can be exposed and consumed via WCF Data Services, and the process is very simple, elegant and intuitive.

    Applications exposing OData services:
      • SharePoint 2010
      • IBM WebSphere
      • Microsoft SQL Azure
      • Windows Azure Table Storage
      • SQL Server Reporting Services

    Live OData services:
      • Netflix
      • Open Science Data Initiative
      • Open Government Data Initiatives
      • Northwind database exposed as an OData service
      • and many others

    Some may prefer to call it commoditization of data, unification of data access strategies, or any other sweet name. I for one will stick to my secular definition. :) Technorati Tags: Sarang,OData,MOSP
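
    Because OData rides on plain HTTP and REST, a feed can be consumed without WCF Data Services at all. A minimal sketch using Python and the requests package; the public Northwind sample endpoint below is illustrative (it is the well-known demo service and may have moved), and the response-shape handling covers both older and newer OData JSON formats:

      import requests

      # Ask the Northwind sample OData service for a few customers as JSON.
      # $format, $top and $filter are standard OData query options.
      base = "https://services.odata.org/Northwind/Northwind.svc"  # assumed sample endpoint
      resp = requests.get(
          f"{base}/Customers",
          params={"$format": "json", "$top": "3", "$filter": "Country eq 'Germany'"},
          timeout=10,
      )
      resp.raise_for_status()

      # Older OData versions wrap results in {"d": {"results": [...]}}; newer ones use {"value": [...]}.
      payload = resp.json()
      rows = payload.get("value") or payload.get("d", {}).get("results", [])
      for customer in rows:
          print(customer["CustomerID"], "-", customer["CompanyName"])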

    Read the article

  • Active Directory Password Policy Problem

    - by Will
    To clarify: my question is why my password policy isn't applying to people in the domain. Hey guys, having trouble with our password policy in Active Directory. Sometimes it just helps me to type out what I'm seeing; it appears not to be applying properly across the board. I am new to this environment and to AD in general, but I think I have a general grasp of what should be going on. It's a pretty simple AD setup without too many Group Policies being applied. It looks something like this:

      DOMAIN
        Default Domain Policy (link enabled)
        Password Policy (link enabled and enforced)
        Personal OU
          Force Password Change (completely empty, nothing in this GPO)
        IT OU
          Lockout Policy (link enabled and enforced)
        CS OU
          Lockout Policy
        Accounting OU
          Lockout Policy

    The Password Policy and Default Domain Policy both define the same things under Computer Configuration > Windows Settings > Security Settings:

      Account Policies / Password Policy
        Enforce password history: 24 passwords remembered
        Maximum password age: 180 days
        Minimum password age: 14 days
        Minimum password length: 6 characters
        Password must meet complexity requirements: Enabled
        Store passwords using reversible encryption: Disabled
      Account Policies / Account Lockout Policy
        Account lockout duration: 10080 minutes
        Account lockout threshold: 5 invalid login attempts
        Reset account lockout counter after: 30 minutes

    The IT Lockout policy just sets the screen saver settings to lock computers when the user is idle. After running Group Policy Modeling, it seems like the Password Policy and Default Domain Policy are getting applied to everyone. Here are the results of Group Policy Modeling on MO-BLANCKM using the mblanck account; as you can see, the policies are both being applied, with nothing important being denied:

      Group Policy Results: NCLGS\mblanck on NCLGS\MO-BLANCKM
      Data collected on: 12/29/2010 11:29:44 AM

      Summary - Computer Configuration
        General
          Computer name: NCLGS\MO-BLANCKM
          Domain: NCLGS.local
          Site: Default-First-Site-Name
          Last time Group Policy was processed: 12/29/2010 10:17:58 AM
        Applied GPOs (Name / Link Location / Revision)
          Default Domain Policy / NCLGS.local / AD (15), Sysvol (15)
          WSUS-52010 / NCLGS.local/WSUS/Clients / AD (54), Sysvol (54)
          Password Policy / NCLGS.local / AD (58), Sysvol (58)
        Denied GPOs (Name / Link Location / Reason Denied)
          Local Group Policy / Local / Empty
        Security Group Membership when Group Policy was applied
          BUILTIN\Administrators, Everyone, S-1-5-21-507921405-1326574676-682003330-1003, BUILTIN\Users, NT AUTHORITY\NETWORK, NT AUTHORITY\Authenticated Users, NCLGS\MO-BLANCKM$, NCLGS\Admin-ComputerAccounts-GP, NCLGS\Domain Computers
        WMI Filters (Name / Value / Reference GPO(s))
          None
        Component Status (Component Name / Status / Last Process Time)
          Group Policy Infrastructure / Success / 12/29/2010 10:17:59 AM
          EFS recovery / Success (no data) / 10/28/2010 9:10:34 AM
          Registry / Success / 10/28/2010 9:10:32 AM
          Security / Success / 10/28/2010 9:10:34 AM

      Summary - User Configuration
        General
          User name: NCLGS\mblanck
          Domain: NCLGS.local
          Last time Group Policy was processed: 12/29/2010 11:28:56 AM
        Applied GPOs (Name / Link Location / Revision)
          Default Domain Policy / NCLGS.local / AD (7), Sysvol (7)
          IT-Lockout / NCLGS.local/Personal/CS / AD (11), Sysvol (11)
          Password Policy / NCLGS.local / AD (5), Sysvol (5)
        Denied GPOs (Name / Link Location / Reason Denied)
          Local Group Policy / Local / Empty
          Force Password Change / NCLGS.local/Personal / Empty
        Security Group Membership when Group Policy was applied
          NCLGS\Domain Users, Everyone, BUILTIN\Administrators, BUILTIN\Users, NT AUTHORITY\INTERACTIVE, NT AUTHORITY\Authenticated Users, LOCAL, NCLGS\MissingSkidEmail, NCLGS\Customer_Service, NCLGS\Email_Archive, NCLGS\Job Ticket Users, NCLGS\Office Staff, NCLGS\CUSTOMER SERVI-1, NCLGS\Prestige_Jobs_Email, NCLGS\Telecommuters, NCLGS\Everyone - NCL
        WMI Filters (Name / Value / Reference GPO(s))
          None
        Component Status (Component Name / Status / Last Process Time)
          Group Policy Infrastructure / Success / 12/29/2010 11:28:56 AM
          Registry / Success / 12/20/2010 12:05:51 PM
          Scripts / Success / 10/13/2010 10:38:40 AM

      Computer Configuration > Windows Settings > Security Settings
        Account Policies/Password Policy (Policy / Setting / Winning GPO)
          Enforce password history / 24 passwords remembered / Password Policy
          Maximum password age / 180 days / Password Policy
          Minimum password age / 14 days / Password Policy
          Minimum password length / 6 characters / Password Policy
          Password must meet complexity requirements / Enabled / Password Policy
          Store passwords using reversible encryption / Disabled / Password Policy
        Account Policies/Account Lockout Policy (Policy / Setting / Winning GPO)
          Account lockout duration / 10080 minutes / Password Policy
          Account lockout threshold / 5 invalid logon attempts / Password Policy
          Reset account lockout counter after / 30 minutes / Password Policy
        Local Policies/Security Options - Network Security (Policy / Setting / Winning GPO)
          Network security: Force logoff when logon hours expire / Enabled / Default Domain Policy
        Public Key Policies/Autoenrollment Settings (Policy / Setting / Winning GPO)
          Enroll certificates automatically / Enabled / [Default setting]
            Renew expired certificates, update pending certificates, and remove revoked certificates: Disabled
            Update certificates that use certificate templates: Disabled
        Public Key Policies/Encrypting File System
          Properties (Winning GPO: [Default setting])
            Allow users to encrypt files using Encrypting File System (EFS): Enabled
          Certificates (Issued To / Issued By / Expiration Date / Intended Purposes / Winning GPO)
            SBurns / SBurns / 12/13/2007 5:24:30 PM / File Recovery / Default Domain Policy
          For additional information about individual settings, launch Group Policy Object Editor.
        Public Key Policies/Trusted Root Certification Authorities
          Properties (Winning GPO: [Default setting])
            Allow users to select new root certification authorities (CAs) to trust: Enabled
            Client computers can trust the following certificate stores: Third-Party Root Certification Authorities and Enterprise Root Certification Authorities
            To perform certificate-based authentication of users and computers, CAs must meet the following criteria: Registered in Active Directory only

      Computer Configuration > Administrative Templates
        Windows Components/Windows Update (Policy / Setting / Winning GPO)
          Allow Automatic Updates immediate installation / Enabled / WSUS-52010
          Allow non-administrators to receive update notifications / Enabled / WSUS-52010
          Automatic Updates detection frequency / Enabled / WSUS-52010
            Check for updates at the following interval (hours): 1
          Configure Automatic Updates / Enabled / WSUS-52010
            Configure automatic updating: 4 - Auto download and schedule the install
            The following settings are only required and applicable if 4 is selected.
            Scheduled install day: 0 - Every day
            Scheduled install time: 03:00
          No auto-restart with logged on users for scheduled automatic updates installations / Disabled / WSUS-52010
          Re-prompt for restart with scheduled installations / Enabled / WSUS-52010
            Wait the following period before prompting again with a scheduled restart (minutes): 30
          Reschedule Automatic Updates scheduled installations / Enabled / WSUS-52010
            Wait after system startup (minutes): 1
          Specify intranet Microsoft update service location / Enabled / WSUS-52010
            Set the intranet update service for detecting updates: http://lavender
            Set the intranet statistics server: http://lavender (example: http://IntranetUpd01)

      User Configuration > Administrative Templates
        Control Panel/Display (Policy / Setting / Winning GPO)
          Hide Screen Saver tab / Enabled / IT-Lockout
          Password protect the screen saver / Enabled / IT-Lockout
          Screen Saver / Enabled / IT-Lockout
          Screen Saver executable name / Enabled / IT-Lockout
            Screen Saver executable name: sstext3d.scr
          Screen Saver timeout / Enabled / IT-Lockout
            Number of seconds to wait to enable the Screen Saver - Seconds: 1800
        System/Power Management (Policy / Setting / Winning GPO)
          Prompt for password on resume from hibernate / suspend / Enabled / IT-Lockout

    Read the article

  • Pinnacle Studio 14 Importer crashes on Windows 7 x64

    - by Tuminoid
    Pinnacle Studio 14 Importer crashes on Windows 7 x64, so I'm unable to import videos from my camcorder. I've Googled around and it seems to be a problem with x64 support in the program, but there is no info on whether it can somehow be remedied with compatibility settings or XP Mode. Further Googling got me this link, and none of the tips on the site it links to help. I've got no idea what ObjectDock Plus 2 is (there is no program or service installed by that name), which he claims causes the problem. Is that ObjectDock critical to Pinnacle Studio, and how do I get rid of it (or any other advice on how to make Studio 14 work on x64)? UPDATE: 1) XP Mode doesn't help, as it doesn't handle FireWire, which is how my Canon HV20 must be connected to transfer HDV from tape. 2) I don't have ObjectDock installed. 3) I don't want to re-buy PS15 just to get this solved, as PS14 should be able to do it.

    Read the article

  • Extending UPK with Enablement Packs

    - by bill.x.miller
    We've mentioned in earlier posts that UPK Development keeps the tool up to date through the use of Enablement Service Packs (ESPs). Regular releases ensure that the UPK Developer supports updates to targeted applications as well as new Java updates. Installing an ESP is quick and easy:

      • Download the latest ESP from My Oracle Support (requires a My Oracle Support account).
      • Run the setup on each client machine that uses the UPK Developer.
      • Run the Library Updates from one of the clients (multi-user installations only).

    Enablement Pack 1 for UPK 3.6.1 contains new features such as a new Tabbed Gateway, Firefox 3.6 support for the Player and SmartHelp, and several new target application versions. But a very exciting feature that is part of this ESP is now available to all Oracle E-Business Suite customers. Until now, a requirement for EBS customers who wish to record UPK content has been to install delivered library files (CUSTOM.pll and ODPN.pll) onto the Oracle Application Server. These files were required to present context information to the UPK Developer so that content can be launched in a context-sensitive manner, and installing them required the Oracle system administrator to transfer, install and compile the libraries into the system. Usually a simple process; however, we understood the need to streamline the procedure. With ESP 1 for UPK 3.6.1, these pll files are no longer required. Now a simple procedure from within the EBS application can make context available to the UPK Recorder: from the System Profile, search for UPK, change the Site field to Enable UPK Recording, and save the form. Context information will now be made available to the UPK Recorder without involving the system administrator or DBA. The setting you see here makes context available to all client machines recording content with UPK and does not affect the performance of your EBS application.

    Read the article

  • SOA Governance Starts with People and Processes

    - by Jyothi Swaroop
    While we all agree that SOA Governance is about People, Processes and Technology, some experts are of the opinion that SOA Governance begins with People and Processes but needs to be empowered with technology to achieve the best results. Here's an interesting piece from David Linthicum on eBizq: In the world of SOA, the concept of SOA governance is getting a lot of attention. However, how SOA governance is defined and implemented really depends on the SOA governance vendor who just left the building within most enterprises. Indeed, confusion is a huge issue when considering SOA governance, and the core issues are more about the fundamentals of people and processes, and not about the technology. SOA governance is a concept used for activities related to exercising control over services in an SOA, including tracking the services, monitoring the services, and controlling changes made to the services, simply put. The trouble comes in when SOA governance vendors attempt to define SOA governance around their technology, all with different approaches to SOA governance. Thus, it's important that those building SOAs within the enterprise take a step back and understand what is really needed to support the concept of SOA governance. The value of SOA governance is pretty simple. Since services make up the foundation of an SOA, and are at their essence the behavior and information from existing systems externalized, it's critical to make sure that those accessing, creating, and changing services do so using a well-controlled and orderly mechanism. Those of you who already have governance in place, typically around enterprise architecture efforts, will be happy to know that SOA governance does not replace those processes, but becomes a mechanism within the larger enterprise governance concept. People and processes are the first things on the list to get under control before you begin to toss technology at this problem. This means establishing an understanding of SOA governance within the team members, including why it's important, who's involved, and the core processes that are to be followed to make SOA governance work. Indeed, the core SOA governance strategy should really be created independently of the technology. The technology will change over the years, but the core processes and discipline should be relatively durable over time.

    Read the article

  • Multiple computers on Ubuntu One

    - by L R Bellmore Jr
    I have added files from 4 computers to Ubuntu One. One computer failed. What happens to the files that I uploaded from it? Are they still on Ubuntu One? Also, when I upload a file from computer A, why doesn't computer B, with the same Ubuntu One account, sync and load that file? That is what created this question: what happens to files uploaded from computers that are no longer active, have failed, or no longer have Ubuntu One installed? Have I lost the files? The follow-up question is why files from computer A uploaded to Ubuntu One don't sync to computer B. I need to shut down a computer, reformat the hard drive, and install Linux on the entire drive instead of a dual boot with Windows XP, as I am going to use virtual machines instead. What happens to the files from the Windows dual boot that were uploaded to Ubuntu One? Are they removed from Ubuntu One, and have I lost those files if I don't back them up to another service first?

    Read the article

  • TFS Shipping Cadence

    - by Tarun Arora
    Brian Harry has formally announced a change to the TFS shipping cadence, from the traditional 2-3 year ship-to-production cycle to a more agile and refreshing cadence of at least once every 3 weeks! The change didn't happen overnight; it was a gradual process which was greatly influenced by moving TFS to the cloud. The thinking started with trying to figure out what the team wanted. Like people often do, the team started with what they knew and tried to evolve from there. The team spent a few months thinking through "What if we do major releases every year and minor releases every 6 months?", "Major releases every 6 months, patches once a month?", "What if we do quarterly releases – can we get the release cycle going that fast?", etc. The team also spent time debating what constitutes a major release vs. a minor release, and how much churn customers are willing to tolerate. The team finally concluded: "When a change this big is necessary – forget where you are and just ask where you want to be and then ask what it would take to get there." Going forward you will see:

      • Team Foundation Service updates every 3 weeks
      • Visual Studio client updates quarterly (limited to VS 2012 for now)
      • Team Foundation Server updates more frequently than every 2 years, with details still being worked out; the team will definitely deliver one this fall

    Refer to the complete blog post from Brian Harry here.

    Read the article

  • Can I migrate from GNU Mailman to MailChimp?

    - by Flowpoke
    I have ~20 lists running in GNU Mailman, all of which are mostly announce-only (newsletters -- recipients do not reply back to the list). It's alright. Mailman has certainly proven itself, but we want some progressive features and better peace of mind (delivery success, hosting, etc.; we'd be happy paying a third party to handle these things). Can MailChimp give us what we need? I see tons of fun copy and graphics showing off neat features, but what I really want to know is: if MailChimp is doing the mailings, what does the sender address look like? Is MailChimp good for sending out simple newsletters? What about automatic bounce processing / unsubscribing of users? I set up a free account but I don't see how any of it integrates with my own domain... no DNS overrides or CNAME suggestions. Also, I see MailChimp has a clean and nifty API client in Python that I want to integrate into our sites (Django powered), so that really makes the service attractive to me -- I just hope I understand it correctly.
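
    For the API side, lists can also be driven over plain HTTPS without the official client. A rough sketch using Python and requests against MailChimp's current REST API (v3.0, which postdates this question, so treat the endpoint details as assumptions to verify against the API docs); the data-center suffix, list ID and API key are placeholders:

      import requests

      API_KEY = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-us1"   # placeholder; the suffix names the data center
      LIST_ID = "abc123def4"                              # placeholder list/audience id
      DC = API_KEY.rsplit("-", 1)[-1]                     # e.g. "us1"
      BASE = f"https://{DC}.api.mailchimp.com/3.0"

      def subscribe(email: str) -> None:
          """Add (or re-add) a subscriber to the list."""
          resp = requests.post(
              f"{BASE}/lists/{LIST_ID}/members",
              auth=("anystring", API_KEY),                # HTTP basic auth; the username is ignored
              json={"email_address": email, "status": "subscribed"},
              timeout=10,
          )
          resp.raise_for_status()

      subscribe("reader@example.com")

    The same members endpoints cover unsubscribes by changing a member's status, and bounce handling happens on MailChimp's side, which is the piece Mailman left to you.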

    Read the article

  • Site too large to officially use Google Analytics?

    - by Jeff Atwood
    We just got this email from the Google Analytics team:

      We love that you love our product and use it as much as you do. We have observed however, that a website you are tracking with Google Analytics is sending over 1 million hits per day to Google Analytics servers. This is well above the "5 million pageviews per month per account" limit specified in the Google Analytics Terms of Service. Processing this amount of data multiple times a day takes up valuable resources that enable us to continue to develop the product for all Google Analytics users. As such, starting August 23rd, 2010, the metrics in your reports will be updated once a day, as opposed to multiple times during the course of the day. You will continue to receive all the reports and features in Google Analytics as usual. The only change will be that data for a given day will appear the following day. We trust you understand the reasons for this change.

    I totally respect this decision, and I think it's very generous to not kick us out. But how do we do this the right way -- what's the official, blessed Google way to use Google Analytics if you're a "whale" website with lots of hits per day? Or, are there other analytics services that would be more appropriate for very large websites?

    Read the article

  • vsftpd not allowing uploads. 550 response

    - by Josh
    I've set vsftpd up on a CentOS box. I keep trying to upload files, but I keep getting "550 Failed to change directory" and "550 Could not get file size." Here's my vsftpd.conf:

      # The default compiled in settings are fairly paranoid. This sample file
      # loosens things up a bit, to make the ftp daemon more usable.
      # Please see vsftpd.conf.5 for all compiled in defaults.
      #
      # READ THIS: This example file is NOT an exhaustive list of vsftpd options.
      # Please read the vsftpd.conf.5 manual page to get a full idea of vsftpd's
      # capabilities.
      #
      # Allow anonymous FTP? (Beware - allowed by default if you comment this out).
      anonymous_enable=YES
      #
      # Uncomment this to allow local users to log in.
      local_enable=YES
      #
      # Uncomment this to enable any form of FTP write command.
      write_enable=YES
      #
      # Default umask for local users is 077. You may wish to change this to 022,
      # if your users expect that (022 is used by most other ftpd's)
      local_umask=022
      #
      # Uncomment this to allow the anonymous FTP user to upload files. This only
      # has an effect if the above global write enable is activated. Also, you will
      # obviously need to create a directory writable by the FTP user.
      anon_upload_enable=YES
      #
      # Uncomment this if you want the anonymous FTP user to be able to create
      # new directories.
      anon_mkdir_write_enable=YES
      anon_other_write_enable=YES
      #
      # Activate directory messages - messages given to remote users when they
      # go into a certain directory.
      dirmessage_enable=YES
      #
      # The target log file can be vsftpd_log_file or xferlog_file.
      # This depends on setting xferlog_std_format parameter
      xferlog_enable=YES
      #
      # Make sure PORT transfer connections originate from port 20 (ftp-data).
      connect_from_port_20=YES
      #
      # If you want, you can arrange for uploaded anonymous files to be owned by
      # a different user. Note! Using "root" for uploaded files is not
      # recommended!
      #chown_uploads=YES
      #chown_username=whoever
      #
      # The name of log file when xferlog_enable=YES and xferlog_std_format=YES
      # WARNING - changing this filename affects /etc/logrotate.d/vsftpd.log
      #xferlog_file=/var/log/xferlog
      #
      # Switches between logging into vsftpd_log_file and xferlog_file files.
      # NO writes to vsftpd_log_file, YES to xferlog_file
      xferlog_std_format=NO
      #
      # You may change the default value for timing out an idle session.
      #idle_session_timeout=600
      #
      # You may change the default value for timing out a data connection.
      #data_connection_timeout=120
      #
      # It is recommended that you define on your system a unique user which the
      # ftp server can use as a totally isolated and unprivileged user.
      #nopriv_user=ftpsecure
      #
      # Enable this and the server will recognise asynchronous ABOR requests. Not
      # recommended for security (the code is non-trivial). Not enabling it,
      # however, may confuse older FTP clients.
      #async_abor_enable=YES
      #
      # By default the server will pretend to allow ASCII mode but in fact ignore
      # the request. Turn on the below options to have the server actually do ASCII
      # mangling on files when in ASCII mode.
      # Beware that on some FTP servers, ASCII support allows a denial of service
      # attack (DoS) via the command "SIZE /big/file" in ASCII mode. vsftpd
      # predicted this attack and has always been safe, reporting the size of the
      # raw file.
      # ASCII mangling is a horrible feature of the protocol.
      #ascii_upload_enable=YES
      #ascii_download_enable=YES
      #
      # You may fully customise the login banner string:
      #ftpd_banner=Welcome to blah FTP service.
      #
      # You may specify a file of disallowed anonymous e-mail addresses. Apparently
      # useful for combatting certain DoS attacks.
      #deny_email_enable=YES
      # (default follows)
      #banned_email_file=/etc/vsftpd/banned_emails
      #
      # You may specify an explicit list of local users to chroot() to their home
      # directory. If chroot_local_user is YES, then this list becomes a list of
      # users to NOT chroot().
      #chroot_list_enable=YES
      # (default follows)
      #chroot_list_file=/etc/vsftpd/chroot_list
      #
      # You may activate the "-R" option to the builtin ls. This is disabled by
      # default to avoid remote users being able to cause excessive I/O on large
      # sites. However, some broken FTP clients such as "ncftp" and "mirror" assume
      # the presence of the "-R" option, so there is a strong case for enabling it.
      #ls_recurse_enable=YES
      #
      # When "listen" directive is enabled, vsftpd runs in standalone mode and
      # listens on IPv4 sockets. This directive cannot be used in conjunction
      # with the listen_ipv6 directive.
      listen=YES
      # This directive enables listening on IPv6 sockets. To listen on IPv4 and IPv6
      # sockets, you must run two copies of vsftpd whith two configuration files.
      # Make sure, that one of the listen options is commented !!
      #listen_ipv6=YES
      pam_service_name=vsftpd
      userlist_enable=YES
      tcp_wrappers=YES
      log_ftp_protocol=YES
      banner_file=/etc/vsftpd/issue
      local_root=/var/www
      guest_enable=YES
      guest_username=ftpusr
      ftp_username=nobody
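
    To take the desktop FTP client out of the picture, the failing operations can be reproduced with Python's standard ftplib. A small sketch, with the hostname and directory as placeholders; given guest_enable=YES and local_root=/var/www above, pay attention to which directory the session lands in and whether that path is writable by the guest account (ftpusr):

      import io
      from ftplib import FTP, error_perm

      ftp = FTP("ftp.example.com", timeout=10)   # placeholder host
      ftp.login()                                # anonymous login
      print("Landed in:", ftp.pwd())
      ftp.retrlines("LIST")                      # prints the directory listing

      try:
          ftp.cwd("upload")                      # placeholder directory
          ftp.storbinary("STOR test.txt", io.BytesIO(b"hello\n"))
          print("Upload OK, size:", ftp.size("test.txt"))
      except error_perm as exc:
          # 550 responses land here with the server's exact wording.
          print("Server refused:", exc)
      finally:
          ftp.quit()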

    Read the article

  • Sharing two internet connections on my laptop running Windows XP

    - by ashwnacharya
    I have two internet connections: one via our organization's corporate LAN, and the other mobile broadband via a USB modem. Is there any way I can use both connections simultaneously? I want to use the corporate LAN for normal browsing and for my email client, and the USB modem for establishing a VPN connection. Will I be able to maintain both connections at the same time? Can I have parallel downloads, one using the corporate network and the other using the mobile broadband? Will I be able to switch my browser between the two connections? My laptop runs Windows XP Service Pack 2.

    Read the article

  • BizTalk 2009 - BizTalk Server Best Practice Analyser

    - by StuartBrierley
    The BizTalk Server Best Practices Analyser allows you to carry out a configuration-level verification of your BizTalk installation, evaluating the deployed configuration but not modifying or tuning anything that it finds. The Best Practices Analyser uses "reading and reporting" to gather data from different sources, such as:

      • Windows Management Instrumentation (WMI) classes
      • SQL Server databases
      • Registry entries

    When I first ran the analyser I got a number of errors. If you get any errors, these should all be acted upon to resolve them; you should then run the scan again and see if anything else is reported that needs acting upon. As you can see in the image above, the initial issue that jumped out at me was that the SQL Server Agent was not started. The reason for this was absent-mindedness: this run was against my development PC, and I don't have SQL/BizTalk actively running unless I am using them. Starting the agent service and running the scan again gave me the following results: this resolved most of the issues for me, but the next major issue to look at was that there was no tracking host running. You can also see that I was still getting an error with two of the SQL jobs; the problem here was that I had not yet configured them. Configuring the backup and purge jobs and then starting the tracking host before running the scan again cleared all the critical issues, but I did still have a number of warnings. For example, on this report I was warned that the BizTalk MessageBox is hosted on the BizTalk Server. While this is known to be less than ideal, it is as expected on my development environment, where I have Visual Studio, SQL and BizTalk installed on my laptop, and I was happy to ignore this and other similar warnings. In your case you should take a look at any warnings you receive and decide what you want to do about each of them in turn.

    Read the article

  • Redirect users to internal webpage on first visit

    - by Sihan Zheng
    I have a LAN of maybe 20-30 computers and a Windows Server 2003 machine on hand (I can also run any x86 Linux distro). What I am trying to do is redirect users to a webserver inside the LAN the first time they visit certain domains. For example, the first time a user visits "google.com", they will be redirected to 192.168.1.2 (a webserver, where they will be shown a custom webpage); attempts after that will go to Google. Pretty much, I am trying to provide a captive-portal-like service, showing people a custom webpage the first time they try to access certain websites (but not others). I'm pretty flexible on how this can be done, as long as it works. Can you guys give me an idea of how to approach this problem? I am looking for (hopefully) a free solution. Thanks
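
    One way to prototype the "first visit only" part is a tiny handler on the internal webserver (192.168.1.2) that shows the custom page once per client and then bounces the browser on to the real site; getting browsers to reach it in the first place still needs DNS or proxy interception on the LAN, which is a separate piece. A rough sketch using only the Python standard library; the cookie name and destination URL are placeholders:

      from http.server import BaseHTTPRequestHandler, HTTPServer

      SEEN_COOKIE = "seen_notice=1"          # placeholder cookie marking a returning client
      REAL_SITE = "http://www.google.com/"   # placeholder destination after the first visit

      class FirstVisitHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              if SEEN_COOKIE in self.headers.get("Cookie", ""):
                  # Returning visitor: send them straight on to the real site.
                  self.send_response(302)
                  self.send_header("Location", REAL_SITE)
                  self.end_headers()
                  return
              # First visit: show the custom page and set the marker cookie.
              body = b"<html><body><h1>Welcome - please read this notice first.</h1></body></html>"
              self.send_response(200)
              self.send_header("Content-Type", "text/html")
              self.send_header("Set-Cookie", SEEN_COOKIE + "; Max-Age=31536000")
              self.send_header("Content-Length", str(len(body)))
              self.end_headers()
              self.wfile.write(body)

      if __name__ == "__main__":
          # Needs to run with permission to bind port 80.
          HTTPServer(("0.0.0.0", 80), FirstVisitHandler).serve_forever()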

    Read the article

  • Setting default version on Azure Blob Storage?

    - by Erik
    What is the easiest way, without having to create your own utility, to set the default service version to the latest in Azure Blob Storage? http://msdn.microsoft.com/en-us/library/windowsazure/dd894041 There is basically nothing to be set in the Azure portal, and I am having a difficult time finding working utilities for Azure. For some reason Azure defaults to the oldest version, which does not send things like the HTTP Range header, for example. Any utility that can do this? Thank you.
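
    The default version is a property of the storage account itself, set through the Set Blob Service Properties operation (the DefaultServiceVersion element described in the MSDN article linked above) rather than through the portal. A sketch of doing that from Python with the azure-storage-blob (v12) package; the target_version keyword and the version string are assumptions to double-check against the current SDK docs, and the connection string is a placeholder:

      from azure.storage.blob import BlobServiceClient

      # Placeholder connection string for the storage account.
      conn_str = (
          "DefaultEndpointsProtocol=https;AccountName=myaccount;"
          "AccountKey=...;EndpointSuffix=core.windows.net"
      )

      service = BlobServiceClient.from_connection_string(conn_str)

      # Account-wide DefaultServiceVersion; target_version is the assumed keyword
      # for it in the v12 SDK, and "2019-12-12" is just one valid version string.
      service.set_service_properties(target_version="2019-12-12")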

    Read the article

  • links for 2010-05-19

    - by Bob Rhubart
    • Presentations from #otnarchday in Dallas now available on Slideshare - Includes presentations on IT Optimization, Application Integration Architecture, Application Grid, and Infrastructure Consolidation. More to come.
    • Anthony Shorten: JMX Based Monitoring - Part Four - Business App Server Monitoring - Anthony Shorten discusses a new Oracle Utilities Application Framework V4 feature that allows JMX to be used for management and monitoring of the Oracle Utilities business application server component. (tags: oracle otn java architect)
    • New book: Oracle Coherence 3.5 - An overview of the new book by authors Aleksandar Seovic, Mark Falco, Patrick Peralta. (tags: oracle otn grid architect)
    • Douwe Pieter van den Bos: Next step in Virtualization: VirtualBox 3.2 - "For businesses, VirtualBox just might be the answer they where looking for," says Douwe Pieter van den Bos. "A simple and widely supported virtual machine." (tags: oracle otn virtualization architect)
    • Maurice Gamanho: Python and Ruby in Tuxedo - Maurice Gamanho's quick overview of new features in Oracle's Service Architecture Leveraging Tuxedo (SALT) 11gR1. (tags: oracle otn soa architect)
    • Live Webcast: Oracle and AmberPoint - May 20, 2010 - 10 a.m. PT/1 p.m. ET - Ed Horst and Ashish Mohindroo discuss the advantages of the Oracle and AmberPoint combination. (tags: oracle otn architect soa governance)

    Read the article

  • Adaptive Case Management – Exposing the API – part 1 by Roger Goossens

    - by JuergenKress
    One of the most important building blocks of Adaptive Case Management is the ACM API. At one point or another you're gonna need a way to get information (think of a list of stakeholders, available activities, milestones reached, etc.) out of the case. Since there's no web service available yet that exposes the internals of the case, your only option right now is the ACM API. ACM evangelist Niall Commiskey has put some samples online to give you a good feel for the power of the ACM API. The examples show how you can access the API by means of RMI. You first need to obtain a BPMServiceClientFactory, which gives access to the important services you'll mostly be needing, i.e. the IBPMUserAuthenticationService (needed for obtaining a valid user context) and the ICaseService (the service that exposes all important case information). Obtaining an instance of the BPMServiceClientFactory involves some boilerplate coding in which you'll need the RMI URL and user credentials. Read the complete article here. SOA & BPM Partner Community: for regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community. For registration, please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Facebook Wiki Technorati Tags: ACM,API,Adaptive Case Management,Community,Oracle SOA,Oracle BPM,OPN,Jürgen Kress

    Read the article

  • Welcome to the Weblog on Oracle ADF Mobile!

    - by joe.huang
    Welcome to the ADF Mobile team's weblog. My name is Joe Huang; I am the product manager for ADF Mobile. Oracle ADF Mobile is the part of Oracle's Application Development Framework (ADF) that supports the development of enterprise/business applications that run on mobile devices. The development tool for this framework is, of course, Oracle JDeveloper. As some of you may know, we currently support the development of mobile browser-based applications; this part of the product is called ADF Mobile Browser. Additionally, we are close to releasing a technology preview of ADF Mobile Client, which supports the development of on-device, disconnected-capable mobile applications. What's truly unique about the ADF Mobile development process is that it's a very visual and declarative experience, while still allowing power Java developers to completely extend the framework to their liking. The framework also provides a rich set of services needed by an enterprise-grade mobile application; these services would literally take years to implement if they were to be built from the ground up. However, by using JDeveloper and ADF Mobile, you get the entire framework at your service! In the coming entries, the ADF Mobile product development team will publish news, best practices, our observations on mobile technology trends, or just our experiences in playing with "gadgets". Be sure to check back on this page! Sincerely, Joe Huang, Oracle

    Read the article
