Search Results

Search found 717 results on 29 pages for 'justin sterling'.

Page 13/29 | < Previous Page | 9 10 11 12 13 14 15 16 17 18 19 20  | Next Page >

  • Active Directory replication failing with Access is Denied

    - by Justin Love
    I recently discovered that Active Directory replication started failing about a month ago. If I attempt to Replicate Now from the failing domain controller, I receive: "The following error occurred during the attempt to synchronize the domain controllers: Access is denied." It is between two servers at a remote site; one is Windows 2003 and the other is Windows 2000, and the Windows 2000 machine is the one reporting the errors. The domain uses the older OUR_DOMAIN-style name. Attempts so far: disabled the Kerberos service on the Windows 2000 server and restarted; verified that the RPC and RPC Locator services have the expected settings; added ncacn_nb_tcp to HKEY_LOCAL_MACHINE\Software\Microsoft\Rpc\ClientProtocols, which was missing on the Windows 2003 server; PortQry reports OK; firewall disabled; ran netdom resetpwd (and rebooted) on the Windows 2000 server.
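
    A quick way to see each partner's replication status and the exact failing step is the repadmin and dcdiag tools from the Windows Support Tools; a minimal sketch (the DC names below are placeholders, and the tools must be installed on the server you run them from):

      REM Show inbound replication partners and the last replication result for this DC
      repadmin /showrepl DC2000-REMOTE

      REM Run the replication health test against the Windows 2003 DC
      dcdiag /s:DC2003-REMOTE /test:replications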

    Read the article

  • Community Event Leader Tools

    - by GavinPayneUK
    As many of you know, I run a small SQL Server community event here in the UK, SQL Server in the Evening, with the help of Coeo colleague and MVP Justin Langford. There have been half a dozen evening events in the last 16 months, and recently it got to the point where I needed to start putting proper tools in place to communicate with my event’s followers. As well as telling them when the next event is, it’d also be nice to share some of the Chapter Leader mails I get from PASS, etc.  ...(read more)

    Read the article

  • Which AMI to use for Java/Tomcat/MySQL in Amazon EC2?

    - by Justin
    I originally posted this on stackoverflow.com and it was suggested serverfault.com might be a better place to ask this question. So here goes: I'm trying to determine which Amazon Machine Image (AMI) to use as my virtual server in Amazon's EC2. For now, I'll need to choose an AMI that complies with the AWS Free Usage Tier. I want to deploy a Java app that I've been developing using Eclipse on Windows XP, Tomcat 7 and MySQL 5.5. I'm aware that I can choose the Basic 32-bit Amazon Linux AMI and then manually install Tomcat and MySQL (does MySQL get installed on the image or separately on an Elastic Block Store (EBS) volume?). Here's the rub: I'm a bit of a Linux noob. I can start Tomcat and tail the logs and such on Linux, but I'm not familiar with the install process for Tomcat and MySQL on Linux and commands like sudo and chmod. I'm happy to get more hands-on with Linux but I'm short on time right now. Are there AMIs that already have Tomcat and MySQL bundled? The Request Instance Wizard shows 805 Community AMIs that are Free Tier Eligible; 51 of those have "Tomcat" in their name. I'm willing to consider using Elastic Beanstalk, but my research thus far hasn't found any discussion of using MySQL with Beanstalk. The discussions all seem to use Amazon's SimpleDB. Any advice is greatly appreciated.
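
    For what it's worth, on the stock Amazon Linux AMI the manual route usually comes down to a few package-manager commands; a rough sketch (the package names are assumptions and vary by AMI and repository version, e.g. tomcat6 vs tomcat7):

      # install Tomcat and MySQL from the AMI's yum repositories
      sudo yum install -y tomcat6 mysql-server
      # start the services and have them come back after a reboot
      sudo service tomcat6 start
      sudo service mysqld start
      sudo chkconfig tomcat6 on
      sudo chkconfig mysqld on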

    Read the article

  • How do I set the UNC permissions on a SQL Server 2008 Filestream UNC?

    - by Justin Dearing
    I have a SQL Server 2008 instance. I have configured FILESTREAM access properly, and use it from one column on one table in one of my databases. However, I cannot access the UNC share for the FILESTREAM data. I have tried browsing to it as well as trying to open specific files, and I get errors both ways. I am running SQL Server 2008 Enterprise on a Windows 7 workstation running on the domain. I've tried running the SQL Server service as a local user, then as a network admin. The user I am logged in as is a local admin and a sysadmin in SQL Server.

    Read the article

  • Benefit of running redis as a daemon

    - by Justin Meltzer
    I'm trying to understand what the benefit of running Redis as a daemon is. The default Redis configuration seems not to run it as a daemon, but locally on Mac OS X I added it to LaunchAgents, so I'm guessing it is running as a daemon anyway? Also, my remote application runs on a Linux server, which won't have LaunchAgents (as far as I'm aware), so will I have to run Redis as a daemon there? What would be the benefit of doing so?
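
    If it helps, daemonizing is just a switch in redis.conf; a minimal sketch (the file paths are assumptions, adjust for your layout):

      # /etc/redis/redis.conf -- run in the background, detached from the terminal
      daemonize yes
      pidfile /var/run/redis.pid
      logfile /var/log/redis/redis.log

      # then start the server pointing at that config
      redis-server /etc/redis/redis.conf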

    Read the article

  • How can I scan from Canon PIXMA MX700 without disabling OS X firewall?

    - by Justin Love
    I have a Canon PIXMA MX700 connected by Ethernet. When I first bought it I was using OS X 10.4, and scanner-initiated scanning worked fine. After upgrading to 10.6, neither scanner-initiated scanning nor scanning from MP Navigator EX works with the firewall enabled. The firewall lists exceptions for three applications: Canon IJ Network Scan Utility.app, Canon IJ Network Scanner Selector.app, and MP Navigator EX 1.0.app. I get no further blocked warnings, and /var/log/appfirewall.log lists nothing for today (my latest attempt to use it).

    Read the article

  • Why is Varnish not caching?

    - by Justin
    I am troubleshooting the setup of Varnish 3.x on my Ubuntu server. I'm running Drupal 7 on two sites set up on the box, via name-based vhosts. Before trying to get Varnish to play nice with Drupal, I'm trying to just get Varnish to serve a PNG from cache. Here are the headers I get from a curl -I request of the PNG file:

      HTTP/1.1 200 OK
      Server: Apache/2.2.22 (Ubuntu)
      Last-Modified: Sun, 07 Oct 2012 21:18:59 GMT
      ETag: "a57c2-3850-4cb7ea73db6c0"
      Accept-Ranges: bytes
      Content-Length: 14416
      Cache-Control: max-age=1209600
      Expires: Thu, 25 Oct 2012 22:55:14 GMT
      Content-Type: image/png
      Accept-Ranges: bytes
      Date: Thu, 11 Oct 2012 22:55:14 GMT
      X-Varnish: 1766703058
      Age: 0
      Via: 1.1 varnish
      Connection: keep-alive
      X-Varnish-Cache: MISS

    Here is the Varnish VCL file I'm using (it's a default VCL configuration designed for Drupal):

      # Default backend definition. Set this to point to your content
      # server.
      #
      backend default {
        .host = "127.0.0.1";
        .port = "8080";
      }

      # Respond to incoming requests.
      sub vcl_recv {
        # Use anonymous, cached pages if all backends are down.
        if (!req.backend.healthy) {
          unset req.http.Cookie;
        }

        # Allow the backend to serve up stale content if it is responding slowly.
        set req.grace = 6h;

        # Pipe these paths directly to Apache for streaming.
        #if (req.url ~ "^/admin/content/backup_migrate/export") {
        #  return (pipe);
        #}

        # Do not cache these paths.
        if (req.url ~ "^/status\.php$" ||
            req.url ~ "^/update\.php$" ||
            req.url ~ "^/admin$" ||
            req.url ~ "^/admin/.*$" ||
            req.url ~ "^/flag/.*$" ||
            req.url ~ "^.*/ajax/.*$" ||
            req.url ~ "^.*/ahah/.*$") {
          return (pass);
        }

        # Do not allow outside access to cron.php or install.php.
        #if (req.url ~ "^/(cron|install)\.php$" && !client.ip ~ internal) {
        #  # Have Varnish throw the error directly.
        #  error 404 "Page not found.";
        #  # Use a custom error page that you've defined in Drupal at the path "404".
        #  set req.url = "/404";
        #}

        # Always cache the following file types for all users. This list of extensions
        # appears twice, once here and again in vcl_fetch so make sure you edit both
        # and keep them equal.
        if (req.url ~ "(?i)\.(pdf|asc|dat|txt|doc|xls|ppt|tgz|csv|png|gif|jpeg|jpg|ico|swf|css|js)(\?.*)?$") {
          unset req.http.Cookie;
        }

        # Remove all cookies that Drupal doesn't need to know about. We explicitly
        # list the ones that Drupal does need, the SESS and NO_CACHE. If, after
        # running this code we find that either of these two cookies remains, we
        # will pass as the page cannot be cached.
        if (req.http.Cookie) {
          # 1. Append a semi-colon to the front of the cookie string.
          # 2. Remove all spaces that appear after semi-colons.
          # 3. Match the cookies we want to keep, adding the space we removed
          #    previously back. (\1) is first matching group in the regsuball.
          # 4. Remove all other cookies, identifying them by the fact that they have
          #    no space after the preceding semi-colon.
          # 5. Remove all spaces and semi-colons from the beginning and end of the
          #    cookie string.
          set req.http.Cookie = ";" + req.http.Cookie;
          set req.http.Cookie = regsuball(req.http.Cookie, "; +", ";");
          set req.http.Cookie = regsuball(req.http.Cookie, ";(SESS[a-z0-9]+|SSESS[a-z0-9]+|NO_CACHE)=", "; \1=");
          set req.http.Cookie = regsuball(req.http.Cookie, ";[^ ][^;]*", "");
          set req.http.Cookie = regsuball(req.http.Cookie, "^[; ]+|[; ]+$", "");

          if (req.http.Cookie == "") {
            # If there are no remaining cookies, remove the cookie header. If there
            # aren't any cookie headers, Varnish's default behavior will be to cache
            # the page.
            unset req.http.Cookie;
          }
          else {
            # If there are any cookies left (a session or NO_CACHE cookie), do not
            # cache the page. Pass it on to Apache directly.
            return (pass);
          }
        }
      }

      # Set a header to track a cache HIT/MISS.
      sub vcl_deliver {
        if (obj.hits > 0) {
          set resp.http.X-Varnish-Cache = "HIT";
        }
        else {
          set resp.http.X-Varnish-Cache = "MISS";
        }
      }

      # Code determining what to do when serving items from the Apache servers.
      # beresp == Back-end response from the web server.
      sub vcl_fetch {
        # We need this to cache 404s, 301s, 500s. Otherwise, depending on backend but
        # definitely in Drupal's case these responses are not cacheable by default.
        if (beresp.status == 404 || beresp.status == 301 || beresp.status == 500) {
          set beresp.ttl = 10m;
        }

        # Don't allow static files to set cookies.
        # (?i) denotes case insensitive in PCRE (perl compatible regular expressions).
        # This list of extensions appears twice, once here and again in vcl_recv so
        # make sure you edit both and keep them equal.
        if (req.url ~ "(?i)\.(pdf|asc|dat|txt|doc|xls|ppt|tgz|csv|png|gif|jpeg|jpg|ico|swf|css|js)(\?.*)?$") {
          unset beresp.http.set-cookie;
        }

        # Allow items to be stale if needed.
        set beresp.grace = 6h;
      }

      # In the event of an error, show friendlier messages.
      sub vcl_error {
        # Redirect to some other URL in the case of a homepage failure.
        #if (req.url ~ "^/?$") {
        #  set obj.status = 302;
        #  set obj.http.Location = "http://backup.example.com/";
        #}

        # Otherwise redirect to the homepage, which will likely be in the cache.
        set obj.http.Content-Type = "text/html; charset=utf-8";
        synthetic {"
          <html>
          <head>
            <title>Page Unavailable</title>
            <style>
              body { background: #303030; text-align: center; color: white; }
              #page { border: 1px solid #CCC; width: 500px; margin: 100px auto 0; padding: 30px; background: #323232; }
              a, a:link, a:visited { color: #CCC; }
              .error { color: #222; }
            </style>
          </head>
          <body onload="setTimeout(function() { window.location = '/' }, 5000)">
            <div id="page">
              <h1 class="title">Page Unavailable</h1>
              <p>The page you requested is temporarily unavailable.</p>
              <p>We're redirecting you to the <a href="/">homepage</a> in 5 seconds.</p>
              <div class="error">(Error "} + obj.status + " " + obj.response + {")</div>
            </div>
          </body>
          </html>
        "};
        return (deliver);
      }

    I'm getting a MISS and Age: 0 every time. If I'm understanding correctly, this means the file isn't being returned from Varnish's cache. Is there a problem with my Varnish config?
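
    One low-level sanity check that can help here (offered as a sketch, not part of the original question) is to watch Varnish's own hit/miss counters while repeating the curl request, rather than relying on the response headers alone:

      # print all counters once and pull out the cache statistics (Varnish 3.x)
      varnishstat -1 | egrep 'cache_(hit|miss|hitpass)'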

    Read the article

  • TechCast Live: Java and Oracle, One Year Later (tomorrow!)

    - by Jacob Lehrbaum
    One year ago tomorrow, Oracle became the steward of Java through its acquisition of Sun. We invite you to join us tomorrow, on the anniversary of this memorable event, for a special TechCast Live conversation with Ajay Patel, VP of Product Development for Application Grid Products, and Justin Kestelyn of Oracle Technology Network. Topics that will be covered include: highlights, challenges and what we learned over the past year; the future of Java and its importance to Oracle and the community; and Oracle's Application Grid product portfolio today. Date: Feb 15, 10:00am PST. Watch it live (tomorrow).

    Read the article

  • Trouble opening my router to my web server

    - by Justin Heather Barrios
    Here's the story. I have a web server created and connected to my router. The website works great when I'm connected to the router, but when I'm off the network I can't access the website. I got the IP for my router by googling "what is my ip." I have opened ports 80 to 10080 to link to the server in the router. One odd thing that I don't understand: when I am on the network, if I access XXX.XXX.XX.XX:80 I can access the web page no problem, but if I access XXX.XXX.XX.XX:81 (or any other port) I get the error "Cannot access server." Any idea what the problem could be? Could it be my ISP?

    Read the article

  • What is the Best Internet Provider in the Salt Lake Valley for hosting your future online business f

    - by Justin
    This is for people familiar with the ISP scene in Salt Lake. Also, UTOPIA is not available in my neighborhood yet. I'm looking for comparisons between Comcast, Qwest, and especially other providers I'm not aware of. While I will have online backup (of course!), I want to host some things from my own home at the start of my business. Once money starts flowing in, I will move to a hosting provider, but in the meantime I would like a provider that offers fast upload speeds (at least 1 Mbps; fast download is a given), a static IP, and especially a reasonable price.

    Read the article

  • Is there a way to disable safe_mode in a client or domain template in Plesk?

    - by Justin
    Is there a way to disable the PHP setting safe_mode in a client or domain template in Plesk 8.6? Or, to disable it for new domain creation in some other Plesk setting/configuration? I've Googled, and people seem to think no. I've asked our hosting provider, and they don't think so. I've looked in both the domain and client templates in Plesk and don't see it. Has anyone been able to do this in Plesk 8.6 for new domains?

    Read the article

  • Netgear CG3000D new Modem/Router - Random High Ping

    - by justin.chmura
    Cox recently came out, looked at my internet connection, and decided that the modem I had was causing the high-latency issues. The speed was fine, but the ping would spike to 100 and over when gaming or putting any load heavier than browsing on the line. After they replaced it, it seems like I get better latency, but when it spikes I get upwards of 300 ping with around 500 jitter. I figured I would hit the serverfault universe before sending another email to Cox. I opted not to do the Cox setup, as it was an extra $20 and I thought it would have just set up the wireless (which I can handle). Is there a setting or something that I missed that needs to be set up? The firmware for the CG3000D is awful and not fun to use. I did change some hidden settings on the RgServices.asp page (I'll attach a screenshot). I've also heard that the router/modem combos are awful and that I should go back and just ask for a stand-alone modem. Any input is helpful. All screenshots: http://imgur.com/a/JX6qu#0

    Read the article

  • Hourly CRON task running more frequently than one hour

    - by Justin
    I have a cron task that calls a special PHP script via wget. Here is the crontab entry: 0 * * * * wget http://www.... It will work perfectly for several days, running on the hour. However, after a few days the cron job will start to be called several times an hour. I have never seen cron drift like this, so I imagine it can't really be a cron issue. However, the logs of the script that is called clearly show it running several times an hour. Server details: Ubuntu Lucid, Apache, MySQL, PHP5. Time is showing correctly at the command line, and the server is set up to sync with an NTP server. In order for the script to run it must be passed a unique 50-character hash key in the URL, so this script isn't being called from any other source accidentally. What might cause cron to drift like this?
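
    One way to pin down whether cron itself is firing more than once an hour (as opposed to something else hitting the URL) is to have the same crontab line log a timestamp on every run; a rough sketch, with a placeholder URL, key and log path:

      # log every invocation, then make the original request
      0 * * * * echo "fired $(date)" >> /tmp/hourly-wget.log; wget -q -O /dev/null "http://www.example.com/script.php?key=YOUR_50_CHAR_HASH"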

    Read the article

  • shut down FTP from IIS 6 after <X> failed login attempts

    - by Justin C
    Is there a setting in IIS 6 to turn an FTP site off after a specified number of failed login attempts? It has already been documented on this site that a Windows server sitting on a static IP address can record tens of thousands of failed login attempts a month. One server I maintain has had tens of thousands of attempts made against the FTP port. I have solid passwords in place, so I am not overly concerned. I rarely have to use the FTP, so for the most part I turn it on and off as I need it. Sometimes though I forget to turn it off when I am done, only to find the next day that my EventLog is full of audit failures. I would want to set a high number, in case I just messed up the password. Something like if 50 failed login attempts happen, just turn off the FTP site. Then if I need it later I can just start it again.

    Read the article

  • useful JMX metrics for monitoring WebSphere Application Server (and apps inside it)?

    - by Justin Grant
    When managing custom Java applications hosted inside WebSphere Application Server, what JMX metrics do you find most useful for monitoring performance, monitoring availability, and troubleshooting problems? And how do you prefer to slice and visualize those metrics (e.g. chart by top 10 hosts, graph by app, etc.). The more details I can get, the better, as I need to specify a standard set of reports which IT can offer to owners of applications hosted by IT, which those owners can customize but many won't bother. So I'll need to come up with a bunch of generally-applicable reports which most groups can use out-of-the-box. Obviously there's no one perfect answer to this question, so I'll accept the answer with the most comprehensive details and I'll be generous about upvoting any other useful answer. My question is WebSphere-specific, but I realize that most JMX metrics are equally applicable across any container, so feel free to give an answer for JBoss, Tomcat, WebLogic, etc.

    Read the article

  • Copying partial cell to another cell in OpenOffice Calc

    - by Justin
    Cell A1 says "0001 John Smith" and cell A2 says "0002 Bill Snyder". I want to basically split this, so one column just shows the numbers (0001, 0002, etc.) and another column just shows the name. The first part is easy: using the function =LEFT(A1;4) I can get 0001. How can I grab the name? Using =RIGHT(A1;99), for example, will grab the entire string "0001 John Smith". Since each name is a different length, I'm not sure what to do. Can I somehow tell it to grab the whole string EXCEPT the first 4 characters? Or somehow tell it to grab the last 2 WORDS instead of a number of characters like it's asking?
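
    For what it's worth, a minimal sketch of the "everything except the first few characters" approach (assuming the number is always 4 digits followed by a single space):

      =LEFT(A1;4)            (the number portion, as in the question)
      =MID(A1;6;LEN(A1)-5)   (the name: start at character 6 and take the remainder of the string)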

    Read the article

  • Is there a local yubnub.org replacement?

    - by Justin Keogh
    I use yubnub very often... every Google search I do by just pressing (in Firefox) Ctrl-T, then (in the URL bar) typing "y g searchterms" and Enter. "y" in this case is a search keyword I added by right-clicking in the yubnub.org command box. It's really fast, and I just do it automatically now... but the problem is that I am now stuck with whatever the yubnub command I am so used to using does. I can't change it... for example, what if I don't want to use Google but still want to use the "g" command to search? Or say I want to use Google's HTTPS search, etc. I suppose this would be kinda trivial to implement locally, but I would hate to re-invent the code if it's already done and in use... ideas? Also, a local yubnub.org replacement would save me the DNS lookup and traffic to yubnub.org. I don't expect to be able to import all commands from yubnub.org, but that would be cool if possible.
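
    Not a full yubnub replacement, but one purely local option for the single-keyword case worth noting here: Firefox keyword bookmarks give the same "g searchterms" behaviour without touching the network until the search itself. A sketch (the keyword and URL are just examples):

      Bookmark URL: https://www.google.com/search?q=%s
      Keyword:      g

    Typing "g some search terms" in the URL bar then substitutes the terms for %s, so "g" can point at HTTPS search, another engine, or anything else you like.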

    Read the article

  • PHP ssh2_fingerprint() does not match ssh-keygen -lf id_rsa.pub

    - by Justin
    I am using the ssh2 (libssh2) module with PHP and calling the function ssh2_fingerprint() to get the key's fingerprint. According to all resources on the internet, I can get the fingerprint of a public key by executing:

      ssh-keygen -lf id_rsa.pub

    which outputs something like:

      2048 d4:41:3b:45:00:49:4e:fc:2c:9d:3a:f7:e6:6e:bf:e7 id_rsa.pub (RSA)

    However, when I call ssh2_fingerprint($connection, SSH2_FINGERPRINT_HEX) in PHP with the same public key I get:

      dddddba52352e5ab95711c10fdd56f43

    Shouldn't they match? What am I missing?
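
    A point worth checking, offered as an observation rather than a definitive answer: ssh2_fingerprint() takes the connection resource, so it returns the fingerprint of the remote server's host key, not of the id_rsa.pub user key, and by default it is an MD5 hex string without colons. If that is what's happening, the value to compare against would be the server's host key, for example:

      # fingerprint of the server's host key (the path is the usual OpenSSH default; adjust as needed)
      ssh-keygen -lf /etc/ssh/ssh_host_rsa_key.pub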

    Read the article

  • Splitting Multiple Files in Windows

    - by Justin Boucher
    We have a 21TB LUN full of images, each approximately 600KB in size, in multiple subfolders on the disk. We are trying to split the 21TB LUN into 8 smaller LUNs of about 2.6TB apiece in order to process the images more effectively. My question is how we can work out which folders add up to 2.6TB on the drive. What is the best tool to mark off this data so we can copy it to the new, smaller LUNs with robocopy or emcopy without overfilling them? Is there a third-party tool that would be better suited for this task? Thank you in advance for your assistance.

    Read the article

  • Alternative routers to the Cisco SA 500

    - by Justin
    We are evaluating the Cisco SA 500 router for our new office router. Would anyone recommend another similarly-featured router from another manufacturer? Requirements:
    - Office of 14 people
    - We are likely to switch to 14 VOIP phones (Linksys SPA-942) soon
    - We want to use VPN on the router, if possible, with Windows and Mac users

    Read the article

  • Wireless WAN (WWAN) on a Lenovo T500 - built-in or do I need a WWAN modem?

    - by Justin Grant
    I use a Lenovo ThinkPad 2055-3AU at work and I want to get a wireless WAN data plan with a local mobile telecom provider. I've read conflicting reports online about whether my system is "WWAN-ready" or not. How can I find out which wireless WAN providers (if any) my system can support without buying a separate modem? I looked through Device Manager for anything resembling a WWAN device and didn't see anything, but I also wiped the machine when I bought it and clean-installed Windows 7 with only out-of-the-box Windows and Windows Update drivers, so it's possible that the device is there but the drivers aren't installed. FWIW, the support page at http://www-307.ibm.com/pc/support/site.wss/quickPath.do?quickPathEntry=20553AU does not specifically list anything about wireless WAN.

    Read the article

  • Nginx Multiple If Statements Cause Memory Usage to Jump

    - by Justin Kulesza
    We need to block a large number of requests by IP address with nginx. The requests are proxied by a CDN, so we cannot block on the actual client IP address (the source address would be the CDN's, not the actual client's). Instead, $http_x_forwarded_for contains the IP we need to block for a given request. Similarly, we cannot use iptables, as blocking the IP address of the proxied client would have no effect. We need to use nginx to block the request based on the value of $http_x_forwarded_for. Initially, we tried multiple, simple if statements: http://pastie.org/5110910 However, this caused our nginx memory usage to jump considerably: we went from somewhere around a 40MB resident size to over a 200MB resident size. If we changed things up and created one large regex that matched the necessary IP addresses, memory usage was fairly normal: http://pastie.org/5110923 Keep in mind that we're trying to block many more than 3 or 4 IP addresses... more like 50 to 100, which may be included in several (20+) nginx server configuration blocks. Thoughts? Suggestions? I'm interested both in why memory usage would spike so greatly using multiple if blocks, and also in whether there are any better ways to achieve our goal.
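
    Purely as a sketch of one alternative (not from the original post; the addresses below are placeholders): nginx's geo block can key off $http_x_forwarded_for directly and build a single lookup table instead of a chain of ifs, assuming the header holds a single address rather than a comma-separated list:

      # in the http {} context: map forwarded addresses to a flag
      geo $http_x_forwarded_for $deny_xff {
          default        0;
          192.0.2.15     1;
          203.0.113.0/24 1;
      }

      # in each server {} block that needs it
      if ($deny_xff) {
          return 403;
      }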

    Read the article
