Search Results

Search found 17054 results on 683 pages for 'jms request reply'.


  • Modify HTML Content with Squid

    - by user38400
     We have set up our network as per the tutorial here: https://help.ubuntu.com/community/Upside-Down-TernetHowTo. Basically, we have a Squid proxy that inverts images on pages that clients request. We're trying to modify the script so that we can edit the contents of a web page before it is sent to the client, but we're not having any luck. I'm wondering if there is something about .html files that makes this impossible. What happens is that we wget the requested URI, save it locally, modify it, and then echo back the new URI. The page the user gets is the unmodified page, not the one we just changed.
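     For reference, the Upside-Down-Ternet approach works through Squid's url_rewrite_program: the helper reads a requested URL on stdin and echoes back a replacement URL pointing at a locally served, modified copy. A minimal sketch of such a helper (the paths, the sed edit, and the local web root are placeholder assumptions, not the tutorial's actual script; the rewritten URL must be served by a web server the proxy can reach, or the browser will just show the original page):

         #!/bin/bash
         # Hypothetical helper, wired up in squid.conf as:
         #   url_rewrite_program /usr/local/bin/rewrite.sh
         while read url rest; do
             case "$url" in
                 http://127.0.0.1/*) echo "$url"; continue ;;  # don't rewrite our own copies
             esac
             name="$(echo "$url" | md5sum | cut -d' ' -f1).html"
             wget -q -O "/var/www/modified/$name" "$url"             # fetch the original page
             sed -i 's/Example/EXAMPLE/g' "/var/www/modified/$name"  # placeholder content edit
             echo "http://127.0.0.1/modified/$name"                  # tell Squid to serve the copy
         done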

    Read the article

  • Does this BSD-like license achieve what I want it to?

    - by Joseph Szymborski
     I was wondering if this license is:
     - self-defeating
     - just a clone of an existing, better-established license
     - practical
     - any more "corporate-friendly" than the GPL
     - too vague/open-ended
     and finally, if there is a better license that achieves a similar effect. I wanted a license that would (in simple terms):
     - be as flexible/simple as the "Simplified BSD" license (which is essentially the MIT license)
     - allow anyone to make modifications as long as I'm attributed
     - require that I get a notification that such a derived work exists
     - require that I have access to the source code and be given license to use the code
     - not oblige the author of the derivative work to release the source code to the general public
     - not oblige the author of the derivative work to license the derivative work under a specific license

     Here is the proposed license, which is just the Simplified BSD with a couple of additional clauses (all of which are bolded in the original; here they are the last three notification/access clauses).

     Copyright (c) (year), (author) (email)
     All rights reserved.

     Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
     - Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
     - Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
     - The copyright holder(s) must be notified of any redistributions of source code.
     - The copyright holder(s) must be notified of any redistributions in binary form.
     - The copyright holder(s) must be granted access to the source code and/or the binary form of any redistribution upon the copyright holder's request.

     THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

    Read the article

  • Cannot Login to SBS 2008

    - by Ryan Holt
     Hi all, I'm hoping someone has an answer for me. I installed a new Microsoft SBS 2008 server last week and everything appeared to be working normally. I rebooted the server yesterday to finish the install of Microsoft Windows Installer 4.5 and, upon reboot, could no longer log in to the server via either RDP or the local console. The error message states that there are no logon servers available to service the logon request. I can log in to the server fine via Safe Mode with Networking, but not via a normal boot. The server is currently at SP1. I attempted to install SP2 from Safe Mode after enabling the installation services via a registry edit, but the install failed and rolled back after 2 or 3 hours. It appears that one of the services is not starting for some reason. I believe it's LSASS, but I can't actually log in to see the active services during a normal boot. Does anyone have any suggestions?
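     One hedged way to narrow this down from Safe Mode with Networking: an SBS 2008 box is always a domain controller, so the directory and Netlogon services are the usual suspects behind "no logon servers available". The service names below are standard, but treating them as the culprit here is an assumption:

         REM How are the directory-related services configured?
         sc qc ntds
         sc query netlogon
         REM List everything that did manage to start in this boot:
         net start
         REM Recent service-control failures usually land in the System log:
         wevtutil qe System /c:20 /rd:true /f:text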

    Read the article

  • Connection to mysql server in SYN_SENT

    - by Sunil
     We have been facing a strange problem for the last few days between our application server and database server (MySQL): a connection to the database server from the application server hangs in the SYN_SENT state, and after that we are not able to make any connection to the database server on the MySQL port (3306). When we check the netstat output on the database server, the connection is in the SYN_RECV state. What I can figure out is that the MySQL server is receiving the SYN request and responding, but the response is not reaching the client, hence SYN_RECV on the server side and SYN_SENT on the client side. I would expect the SYN_SENT state to time out after a while, so other connection attempts to the same server should not hang because of it. Does anybody have any idea how we can resolve this issue?
     Our setup details:
     Application server: RHEL 5.4, kernel 2.6.18-164.el5, x86_64
     Database server: MySQL 5.1.49 on RHEL 5.4, kernel 2.6.18-164.el5, x86_64
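     A hedged way to confirm where the SYN/ACK is being lost (interface names are assumptions; run one capture on each host while reproducing the hang):

         # On the database server: does the SYN arrive, and does the
         # SYN/ACK actually leave on the wire?
         sudo tcpdump -ni eth0 'tcp port 3306 and tcp[tcpflags] & (tcp-syn|tcp-rst) != 0'

         # On the application server: does the SYN/ACK ever come back?
         sudo tcpdump -ni eth0 'tcp port 3306'

         # Count half-open connections on the client side:
         netstat -tan | grep -c SYN_SENT

     If the SYN/ACK leaves the server but never reaches the client, the suspect shifts to whatever stateful device sits in between (for example, a full conntrack table on a firewall).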

    Read the article

  • Cost effective way to provide static media content

    - by james
     I'd like to deliver around 50MB of static content, either as about 30 individual files of up to 10MB or grouped into 3 compressed files, around 5k to 20k times a day. Ideally I'd like some very basic security around providing the data, to ensure that a request is from the expected source, but if dropping the security for a big reduction in price is possible then it's an option. Does anyone have any suggestions beyond what I've found? Google App Engine is $0.12/GB and I believe has a file size limit of 10MB, so I'd have to break the data up a bit; a rough calculation suggests this would cost me about $30 to $120 a day. Or I've seen what seems to be plain static content delivery with no logic capabilities, like Usenet.nl, at what I calculate to be about $0.025/GB, which would cost me about $6 to $25 a day. Any idea if I'm going about these calculations right, and whether there might be a better option for static-only content at decently high volume? Again, some basic security would be great, but if cost is greatly reduced without it then I'm up for that.
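     As a sanity check on those figures (assuming the full 50MB is transferred on every delivery), a quick sketch:

         # 50 MB per delivery, 5k-20k deliveries/day:
         awk 'BEGIN {
             for (n = 5000; n <= 20000; n += 15000) {
                 gb = 50 * n / 1024
                 printf "%5d deliveries: %4.0f GB/day -> $%3.0f at $0.12/GB, $%2.0f at $0.025/GB\n",
                        n, gb, gb * 0.12, gb * 0.025
             }
         }'
         # ->  5000 deliveries:  244 GB/day -> $ 29 at $0.12/GB, $ 6 at $0.025/GB
         # -> 20000 deliveries:  977 GB/day -> $117 at $0.12/GB, $24 at $0.025/GB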

    Read the article

  • MDX needs a function or macro syntax

    - by Darren Gosbell
     I was having an interesting discussion with a few people about the impact of named sets on performance (the same discussion noted by Chris Webb here: http://cwebbbi.wordpress.com/2011/03/16/referencing-named-sets-in-calculations). Apparently the core of the performance issue comes down to the way named sets are materialized within the SSAS engine. That led me to the thought that what we really need is a syntax for declaring a non-materialized set, or, to take this even further, a way of declaring an MDX expression as a function or macro so that it can be re-used in multiple places. Sometimes you do want the set materialised, such as when you use an ordered set for calculating rankings. But a lot of the time we just want to make our MDX modular and avoid repeating the same code over and over. I did some searches on Connect and could not find any similar suggestions, so I posted one here: https://connect.microsoft.com/SQLServer/feedback/details/651646/mdx-macro-or-function-syntax Although apparently I did not search quite hard enough, as Chris Webb made a similar suggestion some time ago, including a request for true MDX stored procedures (not the .Net-style stored procs that we have at the moment): https://connect.microsoft.com/SQLServer/feedback/details/473694/create-parameterised-queries-and-functions-on-the-server Chris also pointed out this post he wrote last year, http://cwebbbi.wordpress.com/2010/09/13/iccube/, where he noted that the icCube product already has this sort of functionality. So if you think either or both of these suggestions is a good idea, I would encourage you to click on the links and vote for them.
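     To make the distinction concrete, here is a hedged sketch (the cube, dimension and measure names are invented) of the one case where materialization genuinely helps: a set ordered once and then reused for ranking. Inlining the ORDER expression into the calculated member instead would re-evaluate it cell by cell, which is exactly the repetition a function/macro syntax would let you factor out without forcing materialization:

         WITH SET [Ordered Products] AS
             ORDER(
                 [Product].[Product].MEMBERS,
                 [Measures].[Sales Amount],
                 BDESC
             )
         MEMBER [Measures].[Sales Rank] AS
             RANK([Product].[Product].CURRENTMEMBER, [Ordered Products])
         SELECT { [Measures].[Sales Amount], [Measures].[Sales Rank] } ON COLUMNS,
                [Ordered Products] ON ROWS
         FROM [Sales]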

    Read the article

  • 503 service unavailable when debugging PHP script in Zend Studio

    - by user25932
     I have a web server with Apache 2.0 installed; it comes with the Zend Server install pack. When I try to debug my PHP files, Apache serves a blank page with 503 Service Unavailable. Of course slow server-side code ties up Apache requests for far too long, but I need it to wait until my debugging session ends. When I request the page from a browser it launches Zend Studio, debugging my PHP script (the request is redirected to the Zend Debugger module). I step through my script, and if I finish debugging within 120 seconds I return to the browser normally. When it takes more than 120 seconds, the browser displays '503 Service Unavailable' and I can't get back to the page output. I have even forced 'max_execution_time = 300' and 'max_input_time = 600' in php.ini and 'TimeOut = 500' in httpd.conf. It happens in Opera, IE and Firefox alike. I spent two days googling and have found no answer so far.

    Read the article

  • iproute2 preemptive route creation, I think...

    - by Bryan Hunt
     Firstly: I know I could do this the easy way with SSH, but I want to learn how to route. I want to route packets back through the same tun0 interface they came in on. I can do it for single routes; this works:

         sudo ip route add 74.52.23.120 metric 2 via 10.8.0.1

     But I'd have to add routes manually for each request that came down the pipe. I've taken the blue pill and followed the "Netfilter & iproute - marking packets" tutorial at http://lartc.org/howto/lartc.netfilter.html, but it's oriented towards redirecting OUTGOING packets based upon marks. What I want is for a packet that comes in via tun0 not to be dropped, which is what's happening right now: running Scapy or suchlike to receive packets, I don't seem to receive anything. Watching in Wireshark, I see the initial SYN packets coming in on the tun0 interface, but that's as far as it gets without a static route like the one shown above. Am I nuts?
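     The marking approach from the LARTC page can be made to cover replies by using connection marks rather than packet marks: mark connections that arrive on tun0, restore the mark onto locally generated reply packets, and give marked traffic its own routing table. A hedged sketch (the table number, gateway and the rp_filter step are assumptions about this setup):

         # Mark connections whose first packet arrived on tun0:
         sudo iptables -t mangle -A PREROUTING -i tun0 -j CONNMARK --set-mark 1
         # Copy the connection mark back onto outgoing (reply) packets:
         sudo iptables -t mangle -A OUTPUT -j CONNMARK --restore-mark
         # Route marked packets via a dedicated table through tun0:
         sudo ip rule add fwmark 1 table 100
         sudo ip route add default via 10.8.0.1 dev tun0 table 100
         # Reverse-path filtering can also silently drop asymmetric traffic:
         sudo sysctl -w net.ipv4.conf.tun0.rp_filter=0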

    Read the article

  • High availability for Windows Service under Windows Server 2003

    - by empi
     Hi. I have the following situation: I need to deploy a Windows service that listens for incoming requests on a TCP port (basically a WCF service). I have a high-availability requirement: the service must be deployed on two servers, and if the service stops (only the service, not the whole server) on one server, all requests must be redirected to the second one. To me this looks like a basic failover scenario. How can I achieve this on Windows Server 2003? Should I use Microsoft Cluster Service or Network Load Balancing? The important part is that swapping the servers must be transparent to the clients (the client must see only a single address, a single host or domain name). Thanks in advance for your help.

    Read the article

  • Send Multiple InMemory Attachments Using FileUpload Controls

    - by bullpit
     I wanted to give users the ability to send multiple attachments from the web application. I did not want anything fancy, just a few FileUpload controls on the page and then send the email. So I dropped five FileUpload controls on the web page and created a function to send email with multiple attachments. Here's the code (using directives added for completeness):

         using System;
         using System.IO;
         using System.Net.Mail;
         using System.Web;

         public static void SendMail(string fromAddress, string toAddress, string subject, string body, HttpFileCollection fileCollection)
         {
             // CREATE THE MailMessage OBJECT
             MailMessage mail = new MailMessage();

             // SET ADDRESSES
             mail.From = new MailAddress(fromAddress);
             mail.To.Add(toAddress);

             // SET CONTENT
             mail.Subject = subject;
             mail.Body = body;
             mail.IsBodyHtml = false;

             // ATTACH FILES FROM THE HttpFileCollection
             for (int i = 0; i < fileCollection.Count; i++)
             {
                 HttpPostedFile file = fileCollection[i];
                 if (file.ContentLength > 0)
                 {
                     Attachment attachment = new Attachment(file.InputStream, Path.GetFileName(file.FileName));
                     mail.Attachments.Add(attachment);
                 }
             }

             // SEND MESSAGE
             SmtpClient smtp = new SmtpClient("127.0.0.1");
             smtp.Send(mail);
         }

     And here's how you call the method:

         protected void uxSendMail_Click(object sender, EventArgs e)
         {
             HttpFileCollection fileCollection = Request.Files;
             string fromAddress = "[email protected]";
             string toAddress = "[email protected]";
             string subject = "Multiple Mail Attachment Test";
             string body = "Mail Attachments Included";
             HelperClass.SendMail(fromAddress, toAddress, subject, body, fileCollection);
         }

    Read the article

  • Kubuntu 12.04 - DNS Issues

    - by AndrewJesaitis
     Starting yesterday (6/11/12), I've been having many network problems. When requesting a page in Chrome, the page hangs on "Sending request" and then eventually times out. I'm within a VPN that has its own DNS server. I've tried to set my DNS manually through the Network-Manager applet and by editing /etc/network/interfaces. Having no luck, I unlinked the resolv.conf symlink and dumped the contents of my old resolv.conf into it. Again having no luck, I deactivated the dnsmasq server in /etc/NetworkManager/NetworkManager.conf by commenting out the dns=dnsmasq line:

         $ cat NetworkManager.conf
         [main]
         plugins=ifupdown,keyfile
         #dns=dnsmasq
         no-auto-default=D0:67:E5:EA:B6:6B,

         [ifupdown]
         managed=false

         $ nm-tool
         NetworkManager Tool
         State: connected (global)

         - Device: eth0 [Wired connection 1] -------------------------------------------
           Type:              Wired
           Driver:            tg3
           State:             connected
           Default:           yes
           HW Address:        D0:67:E5:EA:B6:6B

           Capabilities:
             Carrier Detect:  yes
             Speed:           1000 Mb/s

           Wired Properties
             Carrier:         on

           IPv4 Settings:
             Address:         192.168.254.122
             Prefix:          24 (255.255.255.0)
             Gateway:         192.168.254.2
             DNS:             192.168.254.1

     What is strange is that the network will work fine for a few minutes, then start to time out, then a few minutes later work again. I'm unable to reach internal or external sites while it is timing out. When I dig local sites, I receive no answer; I do receive an answer for google.com. At this point I would usually blame the DNS server, especially since things work when I switch to Google's DNS server, but I need to use our internal DNS to reach our internal sites. Nobody else is having issues, and they are all using DHCP, including one user on 11.04. At this point I'm at a loss, so any help would be appreciated.
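     For what it's worth, on 12.04 the resolvconf framework will overwrite a hand-edited resolv.conf, and the supported place to pin DNS for a static interface is /etc/network/interfaces. A sketch using the addresses from the nm-tool output above (making eth0 static here is an assumption):

         auto eth0
         iface eth0 inet static
             address 192.168.254.122
             netmask 255.255.255.0
             gateway 192.168.254.2
             dns-nameservers 192.168.254.1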

    Read the article

  • How to configure squid to retrieve (and cache) my static resources directly?

    - by fabien7474
     I have an Apache/Tomcat/Spring tc Server stack running on a CentOS EC2 VM. I would like to install Squid on the same machine as a proxy that retrieves (directly, i.e. without forwarding every request to Apache/Tomcat) and caches static content ONLY, identified by the URI prefixes /images, /css or /js. Other URIs should be forwarded to the normal web server and not cached. Since I am a newbie, I couldn't work out from the Squid documentation how to configure this behavior (or whether it is even possible). Could you please tell me how to configure Squid for this purpose? Thank you.
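     One hedged reading: run Squid as a reverse proxy (accelerator) in front of the stack. Squid still has to fetch each static object from the origin once, but after that it serves /images, /css and /js from its cache without touching Apache/Tomcat. A squid.conf sketch (ports, host names and the exact directive set vary by Squid version; treat this as a starting point, not a verified config):

         # Listen on 80 as an accelerator for the origin on 8080:
         http_port 80 accel defaultsite=www.example.com
         cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=origin

         # Cache only the static URI prefixes, pass everything else through:
         acl static urlpath_regex ^/(images|css|js)/
         cache allow static
         cache deny all

         cache_peer_access origin allow all
         http_access allow all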

    Read the article

  • Chef bootstrap giving 401 unauthorized

    - by loddy1234
     I'm trying to bootstrap a new Chef node by running:

         knife bootstrap <server ip> -x lewis -N gitlab --sudo

     But I get the following output:

         [Mon, 03 Sep 2012 14:45:17 +0000] INFO: *** Chef 10.12.0 ***
         [Mon, 03 Sep 2012 14:45:17 +0000] INFO: Client key /etc/chef/client.pem is not present - registering
         [Mon, 03 Sep 2012 14:45:17 +0000] INFO: HTTP Request Returned 401 Unauthorized: Failed to authenticate. Ensure that your client key is valid.
         [Mon, 03 Sep 2012 14:45:17 +0000] FATAL: Stacktrace dumped to /var/chef/cache/chef-stacktrace.out
         [Mon, 03 Sep 2012 14:45:17 +0000] FATAL: Net::HTTPServerException: 401 "Unauthorized"

     My Chef server is running Ubuntu 12.04 x32 and the machine I'm trying to bootstrap is running CentOS 6.3 x64. Any idea what's going wrong?
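     Two common causes of a 401 at this registration step, offered as hedged guesses rather than a diagnosis: the validation key on the node doesn't match the server, or the node's clock is skewed (Chef signs requests with a timestamp and rejects drift beyond a few minutes). Quick checks on the new node:

         # Clock skew: sync the clock and retry the bootstrap.
         sudo ntpdate pool.ntp.org

         # Validation key: /etc/chef/validation.pem on the node must be
         # identical to the validation.pem on the Chef server.
         md5sum /etc/chef/validation.pem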

    Read the article

  • How to achieve a loosely coupled REST API but with a defined and well understood contract?

    - by BestPractices
     I am new to REST and am struggling to understand how one would properly design a REST system to allow loose coupling while still letting a consumer of the API understand it. If, in my client code, I issue a GET request for a resource and get back XML, how do I know what to do with that XML? E.g. if it contains <fname>John</fname><lname>Smith</lname>, how do I know that these refer to the concepts of "first name" and "last name"? Is it up to the person writing the REST API to define somewhere in documentation what each of the XML fields means? What if the producer of the API wants to change the implementation to be <firstname> instead of <fname>? How do they do this and notify their consumers that the change occurred? Or do the consumers just encounter the error, look at the payload, and figure out on their own that it changed? I've read in REST in Practice that using a WADL tool to create a client implementation based on the WADL (and hide the fact that you're making a distributed call) is an "anti-pattern". But I was planning to do exactly that; at least then I would have a statically typed API call that, if it changed, would fail at compile time and not at run time. Why is it a bad thing to generate client code based on a WADL? And how do I know what to do with the links returned in the response of a POST to a REST API? What defines this contract and gives true meaning to what each link will do? Please help! I don't understand how to go from statically-typed or even SOAP/RPC to REST!
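     One common (but not universal) convention for the renaming problem is to version the representation through the media type rather than the payload: the contract lives in the documented media type, and a client asks for the version it understands. A hypothetical exchange (the vendor media type is invented for illustration):

         GET /people/42 HTTP/1.1
         Host: api.example.com
         Accept: application/vnd.example.person+xml; version=1

         HTTP/1.1 200 OK
         Content-Type: application/vnd.example.person+xml; version=1

         <person>
           <fname>John</fname>
           <lname>Smith</lname>
           <link rel="self" href="http://api.example.com/people/42"/>
         </person>

     Under this convention, the producer can introduce version=2 with <firstname> while continuing to serve version=1, and the link rel values, documented alongside the media type, are what give meaning to the links that come back from a POST.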

    Read the article

  • Frequent disconnects with oneiric server using wlan AR9285

    - by John Neil
     I'm getting a large number of disconnects from my AR9285 wireless LAN device since I switched to Oneiric Server (I did not see these happen with Oneiric Desktop). Here is the syslog snippet:

         Oct 17 09:43:17 weather kernel: [ 1537.329138] wlan0: deauthenticated from 00:12:17:7a:8e:42 (Reason: 7)
         Oct 17 09:43:17 weather kernel: [ 1537.340409] cfg80211: All devices are disconnected, going to restore regulatory settings
         Oct 17 09:43:17 weather kernel: [ 1537.340423] cfg80211: Restoring regulatory settings
         Oct 17 09:43:17 weather kernel: [ 1537.340435] cfg80211: Calling CRDA to update world regulatory domain
         Oct 17 09:43:17 weather kernel: [ 1537.348571] cfg80211: Ignoring regulatory request Set by core since the driver uses its own custom regulatory domain
         Oct 17 09:43:17 weather kernel: [ 1537.348581] cfg80211: World regulatory domain updated:
         Oct 17 09:43:17 weather kernel: [ 1537.348586] cfg80211:   (start_freq - end_freq @ bandwidth), (max_antenna_gain, max_eirp)
         Oct 17 09:43:17 weather kernel: [ 1537.348594] cfg80211:   (2402000 KHz - 2472000 KHz @ 40000 KHz), (300 mBi, 2000 mBm)
         Oct 17 09:43:17 weather kernel: [ 1537.348600] cfg80211:   (2457000 KHz - 2482000 KHz @ 20000 KHz), (300 mBi, 2000 mBm)
         Oct 17 09:43:17 weather kernel: [ 1537.348607] cfg80211:   (2474000 KHz - 2494000 KHz @ 20000 KHz), (300 mBi, 2000 mBm)
         Oct 17 09:43:17 weather kernel: [ 1537.348613] cfg80211:   (5170000 KHz - 5250000 KHz @ 40000 KHz), (300 mBi, 2000 mBm)
         Oct 17 09:43:17 weather kernel: [ 1537.348620] cfg80211:   (5735000 KHz - 5835000 KHz @ 40000 KHz), (300 mBi, 2000 mBm)

     Here is the relevant lspci output:

         # lspci | grep Atheros
         02:00.0 Network controller: Atheros Communications Inc. AR9285 Wireless Network Adapter (PCI-Express) (rev 01)

     I have done quite a bit of searching and found discussions for previous versions of Ubuntu that recommended installing the linux-backports-modules package. However, this does not appear to be available for Oneiric (only the headers are listed as a package). Any advice on how to achieve a stable wireless connection for this server? Its location makes a wired connection impractical.
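     A frequently suggested workaround for ath9k/AR9285 deauthentication drops, offered as something to test rather than a known fix for this exact setup, is to disable the driver's hardware crypto and the interface's power saving:

         # Load ath9k with hardware crypto disabled (takes effect after a
         # reboot or module reload):
         echo "options ath9k nohwcrypt=1" | sudo tee /etc/modprobe.d/ath9k.conf

         # Turn off wireless power management for the current session:
         sudo iwconfig wlan0 power off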

    Read the article

  • How to configure Dovecot to not serve large emails to high-latency clients?

    - by Daniel Quinn
     I have a Dovecot mail server running at home on a flaky cable connection. For the most part the IMAP functionality works beautifully, but I'd like to add one feature if I can: I want Dovecot not to serve large messages to high-latency clients. That is to say, if someone decides it's a good idea to send me a 9.3 MB email, I don't want to receive it unless I'm on my LAN at home. This can't be an uncommon request, but I'm having trouble finding the configuration option in the documentation. Any ideas and/or good keywords to use in Googling would be awesome.

    Read the article

  • Allow access to WordPress site only by links in email newsletters

    - by Shane
     I send out a personal email newsletter and have been looking into sending it via a service like MailChimp or sendy.co. Many of these email services suggest, or require, that the newsletter content be available online, in case the recipient's email app doesn't render it properly, or at all. The thing is, I don't want my newsletter contents visible to the whole world, nor do I want to require recipients to create (or be assigned) accounts with passwords. So the question is: how can my WordPress site content be viewable only by clicking the link in the email newsletter? It shouldn't be findable in a Google search, but once at the site the visitor should be able to view previous newsletter contents. It seems an .htaccess file would do the trick, but I have been unable to figure out the syntax. Thanks for your help. I have copied below two other questions and answers that helped me word my question clearly. Similar to this request about allowing access to a certain group while still restricting access to the world: "Is there a way to password protect directory only in cpanel. But the user should not be prompted the password, when they try to access it via web?" This person's question is the closest I could find to my situation: "Restrict direct folder access via .htaccess except via specific links".
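     A hedged mod_rewrite sketch of one way this is commonly done: put a secret token in the emailed link, set a cookie when it is present, and require that cookie thereafter. Everything here (the token, cookie name, domain and path) is an invented placeholder, and this is obscurity rather than real security, since anyone who forwards the link shares the access:

         RewriteEngine On

         # Visitors arriving via the emailed link get a cookie:
         RewriteCond %{QUERY_STRING} nlkey=SECRET-TOKEN
         RewriteRule ^ - [CO=nl_access:1:.example.com:43200:/]

         # Everyone else needs that cookie to read the newsletter archive:
         RewriteCond %{QUERY_STRING} !nlkey=SECRET-TOKEN
         RewriteCond %{HTTP_COOKIE} !nl_access=1
         RewriteRule ^newsletter/ - [F]

         # Keep the pages out of search results (needs mod_headers):
         Header set X-Robots-Tag "noindex, nofollow"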

    Read the article

  • NLB RPC server is unavailable on the specified computer

    - by Robin Weston
     Hi guys. Firstly, I'll admit that my networking knowledge is limited, so as people request more information I'll update this question accordingly. I am trying to create an NLB cluster across 2 Windows Server 2008 web servers. Neither machine is a member of a domain, and both have 2 NICs (one for external web traffic, and one for internal communication). I have installed NLB on both machines and created a cluster on Host A, adding Host A itself to it. However, when I try to add Host B (using the address of the external NIC) I get the following error: "The RPC server is unavailable on the specified computer". On Host B I can see that the RPC service is running fine. I can also ping and RDP from Host A to Host B with no problems. I have disabled the Windows firewall on both machines, but that had no effect.

    Read the article

  • Who are the SOA experts? Specialization recognized by customers

    - by Jürgen Kress
     You are looking for SOA experts to deliver a successful project? Contact our Oracle SOA Specialized partners; you can recognize them by the logo, the plaques, and in the solutions catalog.
     Plaques: we would like to offer you a SOA Specialization plaque with your logo as proof of your success. If you are a SOA Specialized partner and would like to request the plaque, please send Brigitte an e-mail with the following information: partner name, partner logo (preferably an eps file), and partner status (gold or platinum). We recommend mounting the plaque at your office reception; in addition, you can use the SOA Specialization logos on your website. Download logo: Gold & Platinum.
     Solutions Catalog: please make sure that your Oracle Partner Network administrator adds your achieved Specializations to the Oracle Solutions Catalog. On our website www.oracle.com/soa we have started to promote the "find a Specialized Partner" search for partners who added their Service Oriented Architecture Specialization to the solutions catalog. To administer your entries, visit "manage solutions catalog" within OPN. For a detailed tutorial and an FAQ, please visit http://tinyurl.com/Catalogorcl
     For more information on SOA Specialization, please read the SOA & Application Grid Specialization Guide and the SOA & Application Grid Specialization Checklist.

    Read the article

  • EBS Customer Relationship Manager (CRM) Product Family Webcasts

    - by user793044
     Oracle's Advisor Webcasts are live presentations given by subject matter experts who deliver knowledge and information about services, products, technologies, best practices and more. Delivered through WebEx, the Oracle Advisor Webcast program brings interactive expertise straight to your desktop, at no cost. Each session is usually followed by a live Q&A where you can have your questions answered. If you miss any of the live webcasts, you can replay the recording or download the PDF of the presentation. Doc ID 740966.1 gives you access to all the scheduled webcasts as well as the archived recordings and presentations; just select the product family you are interested in. Below is a listing of the currently scheduled and archived webcasts for the EBS CRM and Industries product family.

     Upcoming (Oracle E-Business Suite - Service):
     - Oracle Service Charges - Introduction/Overview (Dec 6, 2012) - Register

     Archived (EBS CRM - Service):
     - R12: How to debug Email Center Auto Service Request Creation Failures - Recording | .pdf
     - XCALC: Failed Calculations when Using OIC - Recording | .pdf
     - XPOP: Failed Population When Using Oracle Incentive Compensation (August 30, 2012) - Recording | .pdf
     - XROLL: Failed Roll Up When Using Oracle Incentive Compensation (August 16, 2012) - Recording | .pdf
     - Common Problems Associated with Product Catalog in Sales - Recording | .pdf
     - Oracle Incentive Compensation - Troubleshooting Payment Issues - Recording | .pdf
     - R12 Renewing Service Contracts - Overview - Recording | .pdf
     - 11i and R12 Oracle CRM Service Basics and Troubleshooting - an Overview - Recording | .pdf
     - 11i and R12 Transaction Error Troubleshooting Overview - Recording | .pdf

    Read the article

  • Difference between two kinds of Bing URL Referers

    - by joshuahedlund
     Most of the referral URLs that I get from Bing have the following syntax: http://www.bing.com/search?q=keywords+keywords&[some other variables] However, I just noticed that maybe 10-20% of them come in like this: http://www.bing.com/url?source=search&[some other variables]&url=http%3A%2F%2Fwww.example.com/user-landing-page-on-my-site&yrktarget=_top&q=keywords+keywords&[some other variables] The first syntax gives me the keywords the user typed in; the second gives me both the keywords and the user's landing page on my site. I was originally unaware of this second kind because I have a customized referral report that filters out URLs containing my domain. But now that I've noticed them, I want to know why they occur, to see if I can get more to occur this way, because the second syntax contains more valuable information. If I go to one of the first URLs, it gives me a typical Bing query page; the second URLs just seem to redirect me to the Bing home page. I'm not sure if it has to do with the kind of search being performed (I also get a few http://www.bing.com/shopping/search?q= referers) or some other metric. Does anyone know what causes some referral URLs from Bing to have the /search?q syntax and others the /url?source syntax? P.S. I have verified that I am getting both kinds of URLs from non-advertising clicks. P.P.S. I am not talking about data in Google Analytics or similar software but the raw $_SERVER['HTTP_REFERER'] value coming from the client's original request.
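     Since the raw $_SERVER['HTTP_REFERER'] is what's being inspected, here is a hedged PHP sketch for pulling the keywords and, when the /url?source form appears, the landing page out of it (the parameter names q and url are taken from the examples above; Bing could change them at any time):

         <?php
         $referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
         $params  = array();
         // parse_str() also url-decodes the values for us:
         parse_str((string) parse_url($referer, PHP_URL_QUERY), $params);

         // Present in both syntaxes:
         $keywords = isset($params['q']) ? $params['q'] : null;
         // Only present in the /url?source=search syntax:
         $landing  = isset($params['url']) ? $params['url'] : null;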

    Read the article

  • Monitoring/logging a malfunctioning internet connection

    - by Pekka
     I have a mysterious internet connection problem: every 15-20 minutes the connection becomes very slow, and anything takes 2-3 minutes to load. I've had a technician from the ISP over to test the hardware, and everything is in pristine condition. They have no explanation other than a configuration error on my machine, a possibility I can 90% exclude because I'm experiencing the same problems on another machine. I will have to monitor the situation now, and I would like to run a program that logs when the internet connection becomes slow. I thought about putting something together using at and wget. Does anybody know of a tool that does this out of the box, maybe something with an adjustable request frequency, logging of connection speeds, etc.?
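     In the at/wget spirit, a throwaway shell sketch that logs fetch times on an adjustable interval (the URL, interval and log path are placeholders):

         #!/bin/bash
         # Append a timestamped fetch time every 30 seconds; slow periods
         # show up as large time_total values.
         while true; do
             t=$(curl -s -o /dev/null --max-time 120 -w '%{time_total}' http://www.example.com/)
             echo "$(date '+%F %T') ${t}s" >> "$HOME/connection.log"
             sleep 30
         done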

    Read the article

  • Do CDNs actually dump the data to the client or pass the URL of the content?

    - by zengr
     I am curious about the following. Say, for example, Facebook is a client of the Akamai CDN. Now, when I log in to my Facebook page, I see all the content (video, image, text, etc.) and I click on a video to view it. Facebook is the client of Akamai for that content. So, when a request is made from Facebook to Akamai, does Akamai copy the video/image from its data centers to Facebook's data centers, where it resides for a while (depending on their heuristics) and is flushed after some time? Or do I stream that video directly from Akamai's servers? UPDATE: Data resides in the CDN permanently (agreed), but is a copy of the content sent to Facebook's data centers too?
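     For the second interpretation (the browser talking straight to the edge), this can be checked from the outside: Akamai edge servers return diagnostic cache headers when asked via special Pragma values. A hedged check (the host and path are placeholders, and these debug headers are only honored in some configurations):

         curl -sI \
              -H "Pragma: akamai-x-cache-on, akamai-x-check-cacheable, akamai-x-get-cache-key" \
              http://cdn.example.com/some/video.mp4
         # Look for X-Cache: TCP_HIT (served from the edge cache) versus
         # TCP_MISS (fetched from the origin for this request).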

    Read the article

  • How Service Component Architecture (SCA) Can Be Incorporated Into Existing Enterprise Systems

    After viewing Rob High’s presentation “The SOA Component Model” hosted on InfoQ.com, I can foresee how Service Component Architecture (SCA) can be incorporated into an existing enterprise. According to IBM’s DeveloperWorks website, SCA is a set of specifications which outline a model for constructing applications/systems using a Service-Oriented Architecture (SOA). In addition, SCA builds on open standards such as Web services. In the future, I can easily see how some large IT shops could potentially divide development teams or work groups into component/data-object groups and standard development groups. The component/data-object group would work only on creating and maintaining components that are reused throughout the entire enterprise. The standard development group would work on new and existing projects that incorporate the use of various components to accomplish various business tasks. In my opinion, the incorporation of SCA into any IT department will initially slow the number of new features developed, due to the time needed to create the new, loosely-coupled components. However, once a company matures in its SCA process, the number of program features developed will greatly increase, because the loosely-coupled components needed to add new features will already be built and ready to incorporate into any new development feature request.
    References:
    BEA Systems, Cape Clear Software, IBM, Interface21, IONA Technologies PLC, Oracle, Primeton Technologies Ltd, Progress Software, Red Hat Inc., Rogue Wave Software, SAP AG, Siebel Systems, Software AG, Sun Microsystems, Sybase, TIBCO Software Inc. (2006). Service Component Architecture. Retrieved 11 27, 2011, from DeveloperWorks: http://www.ibm.com/developerworks/library/specification/ws-sca/
    High, R. (2007). The SOA Component Model. Retrieved 11 26, 2011, from InfoQ: http://www.infoq.com/presentations/rob-high-sca-sdo-soa-programming-model

    Read the article

  • GNS3 Cannot ping/resolve DNS record

    - by Eldad Cohen
     I set up an internet lab with GNS3 that has 3 routers; a computer is directly connected to each node. One of the hosts is a DNS server running Windows Server 2003; another is a Windows XP machine. Pings work between routers and machines, but I cannot resolve a domain.com record against the 2003 DNS server. I set up a static NAT on the router to forward all traffic from the gateway to the DNS server's internal IP address, but there is still no answer to the DNS request. Any ideas or thoughts would be most welcome.
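     A hedged first step is to separate routing from DNS: query the 2003 server directly from the XP host, bypassing whatever resolver XP is configured with (the server address below is a placeholder). If this times out while ping succeeds, the usual suspect is that the static NAT only forwards TCP, whereas DNS queries normally travel over UDP port 53:

         C:\> nslookup domain.com 192.168.1.10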

    Read the article
