Search Results

Search found 39224 results on 1569 pages for 'content delivery network'.

  • How to Stream Videos and Music Over the Network Using VLC

    - by Chris Hoffman
    VLC includes a fairly easy-to-use streaming feature that can stream music and videos over a local network or the Internet. You can tune into the stream using VLC or another media player, and you can even use VLC's web interface as a remote control for the stream. Bear in mind, though, that you may not have the bandwidth to stream high-definition videos over the Internet.
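
    For reference, the same streaming feature is also exposed on VLC's command line; a minimal sketch, with file name, port, and addresses as placeholders (the exact --sout chain can vary between VLC versions):

        # serve a file as an HTTP stream on port 8080
        vlc music.mp3 --sout '#standard{access=http,mux=ogg,dst=:8080}'

        # tune in from another machine on the LAN
        vlc http://192.168.1.10:8080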

    Read the article

  • Wired Network (eth0) disappears on a Lenovo Z570

    - by theGeek
    I upgraded from Ubuntu 11.10 to 12.04 yesterday. I am on a BSNL broadband DSL connection. The wired network suddenly disappears and the computer stops detecting the Ethernet card after some time. After rebooting, everything is fine again. Specs: Lenovo IdeaPad Z570, Intel i5 2nd-gen processor, 3 GB RAM.

        $ lspci | grep Ethernet
        03:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8101E/RTL8102E PCI Express Fast Ethernet controller (rev 05)
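
    A few commands that are commonly used to narrow down this kind of intermittent NIC dropout; r8169 is the in-kernel driver that usually binds the RTL8101E/RTL8102E, but treat the module name as an assumption for your kernel:

        # check whether the driver is still loaded once the card disappears
        lsmod | grep r8169

        # look for driver or link errors around the time of the dropout
        dmesg | grep -i -e eth0 -e r8169

        # reload the driver without a reboot (assumes the r8169 module)
        sudo modprobe -r r8169 && sudo modprobe r8169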

    Read the article

  • 3 Common Social Media Network 'Threats' to Avoid

    Online social media network sites have experienced an extraordinary growth in popularity over the last few years. The growth in social media use has captured the attention of not only enthusiastic us... [Author: TJ Philpott - Computers and Internet - May 25, 2010]

    Read the article

  • Build Your Personal Network

    - by AllenMWhite
    Recently a few people have approached me privately about their careers, and how they can make the changes to allow them to do the kind of work they'd like to do, be it consulting or in a full-time role. (In every case, I was flattered and surprised, as I never felt I had that much insight into career choices.) The most important thing, I told each of them, was to use the network of people you know. You will always be more successful finding opportunities through personal contacts than you will through...(read more)

    Read the article

  • Fail to access Network options

    - by Konstantinos Marinis
    I am trying to use OpenDNS on my newly installed Ubuntu 12.10, but I cannot enter custom DNS addresses. I open Network settings and go to my wireless connection, but no matter how many times I press the "Options" button at the lower right corner (I am not using English Ubuntu, so the button might have a different name), nothing happens. Any ideas why, or how should I configure my OpenDNS connection?
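
    If the dialog stays broken, the OpenDNS resolvers can also be forced outside the GUI via resolvconf; a minimal sketch (assumes Ubuntu's stock resolvconf setup, and 208.67.222.222 / 208.67.220.220 are the OpenDNS addresses):

        # prepend the OpenDNS resolvers to the generated resolv.conf
        echo "nameserver 208.67.222.222" | sudo tee -a /etc/resolvconf/resolv.conf.d/head
        echo "nameserver 208.67.220.220" | sudo tee -a /etc/resolvconf/resolv.conf.d/head

        # regenerate /etc/resolv.conf
        sudo resolvconf -u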

    Read the article

  • Upgrade from 12.04 to 12.10 Failed due to network troubles

    - by user99100
    Every time I try to upgrade from 12.04 to 12.10, it keeps telling me that there are network problems. Is this a problem on "the other" side? My upgrade got through downloading half the packages last night, but the machine then went to sleep and the upgrade failed. Could it have something to do with that? Would I need to open a terminal and clear a "cache" (no idea what I'm talking about here)? Thanks. (Solved now: I tried it again throughout the day and it downloaded and installed fine.)
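
    The "cache" in question is the package download cache; one common way to clear it and retry is sketched below (apt-get clean only deletes already-downloaded .deb files, nothing installed):

        # throw away partially downloaded upgrade packages
        sudo apt-get clean

        # then retry the release upgrade
        sudo do-release-upgrade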

    Read the article

  • Common Network Administrator Tools

    - by No Time
    I would like to put together a custom bundle of network admin packages to carry on a thumb drive for administering Debian-based machines. Examples of what I would include so far: nmap, traceroute, vnstat, zenmap. I know every situation may be different, but I would like to build a toolbox I could bring everywhere, and I am looking for advice on other tools that would work. (If there is a similar question, I am fine being directed there.)
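
    As a starting point, something like the following pulls a reasonable portable toolbox onto a Debian-based machine (these are the usual Debian/Ubuntu package names; adjust to taste):

        sudo apt-get install nmap zenmap traceroute vnstat mtr tcpdump iperf dnsutils whois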

    Read the article

  • network problem

    - by TimbooXD
    I was trying to make my IP static on my Ubuntu 12.10 laptop, but I made a mistake, and after restarting networking with sudo /etc/init.d/networking restart, Ubuntu crashed. After rebooting, the system said it couldn't use the network. Now I need to reconnect to the Internet and make my IP static. It is for a server, and that isn't the problem, but how can I reconnect and make my IP static, and what information do I need to know?
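
    For reference, a classic static configuration in /etc/network/interfaces looks like the sketch below; the information you need to know is the address you want, the netmask, the gateway, and a DNS server (all addresses here are placeholders):

        auto eth0
        iface eth0 inet static
            address 192.168.1.50
            netmask 255.255.255.0
            gateway 192.168.1.1
            dns-nameservers 192.168.1.1

    After saving, "sudo ifup eth0" (or a reboot) should bring the interface back up with the static address.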

    Read the article

  • Sending mail to local address crashes web server (sendmail)

    - by deceze
    When trying to send mail automatically from a script at example.com via PHP's mail() to [email protected], the Apache server throws an internal error. I believe it is configured internally to use sendmail. The message gets dropped into ~/dead.letter, and the general error log reads:

        [Wed May 12 11:26:45 2010] [error] [client xxx.xxx.xxx.xxx] malformed header from script. Bad header=/home/example/dead.letter... S: /home/example/www/test.php

    Sending to any other address, not @example.com, works just fine. I have googled and serverfaulted for solutions, but they all require editing configuration files in /etc/mail and similar system places, which is not an option, since this problem occurs on a shared host where I only have access to ~/. Does anyone have a suggestion?
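
    For what it's worth, "malformed header" complaints from mail() usually point at the additional-headers argument; a minimal sketch of a well-formed call, with hypothetical addresses, where every extra header line ends in \r\n:

        <?php
        // hypothetical sender/recipient; extra headers separated by CRLF
        $headers = "From: webmaster@example.com\r\n" .
                   "Reply-To: webmaster@example.com";
        mail('user@example.com', 'Test subject', 'Test body', $headers);
        ?>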

    Read the article

  • drupal what if we have designed a content type with existing fields

    - by rakeshakurathi
    I have a small problem. I have one content type, say Cars, with various fields (say more than 30), and users can create nodes of this content type. Now I would like to show only a few fields in different phases; is there any possibility to do that? More explanation: the user may enter the car model and car details on the first page and upload images on a second page (say, a popup in the block). Is this possible? I am a newbie to Drupal and would like to do this kind of data updating. I thought of designing one more content type with the existing fields; can anyone explain this issue? What if I design a content type Car1 with the same fields (say, file upload) as the Car content type?

    Read the article

  • Exception when programmatically copying content type

    - by BeraCim
    Hi all: I'm getting exceptions when copying a content type from one web to another:

        foreach (SPContentType destinationWebCt in destinationWeb.ContentTypes)
        {
            destinationWeb.ContentTypes.Add(existingWebCt);
            destinationWeb.Update();
        }

    existingWebCt is the content type from another web, e.g. /Site/Web. destinationWeb is the web that I wish to copy the content type to, e.g. /Site/DestinationWeb. I got an SPException that says something like "could not copy content type to scope". Then I decided to replace all "ContentTypes" with "AvailableContentTypes", but then I got an SPException saying this collection could not be modified. So how can I copy a content type to another web? Thanks.
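
    For reference, the usual pattern is to create a new SPContentType in the destination web instead of adding the existing object directly; a minimal hedged sketch (names are placeholders, and it assumes both webs are in the same site collection so the parent content type is resolvable; fields added directly to the source, rather than inherited, would still need to be copied separately):

        // create a sibling content type in the destination web,
        // derived from the same parent as the source
        SPContentType source = sourceWeb.ContentTypes["MyContentType"];
        SPContentType copy = new SPContentType(source.Parent,
                                               destinationWeb.ContentTypes,
                                               source.Name);
        destinationWeb.ContentTypes.Add(copy);
        destinationWeb.Update();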

    Read the article

  • Communicate between content script and options page

    - by Gaurang Tandon
    I have seen many questions about this already, but they are all about background page to content script. Summary: my extension has an options page and a content script. The content script handles the storage functionality (chrome.storage manipulation). Whenever a user changes a setting on the options page, I want to send a message to the content script to store the new data. My code:

    options.js:

        var data = "abcd"; // the data to store
        chrome.tabs.query({ active: true, currentWindow: true }, function (tabs) {
            chrome.tabs.sendMessage(tabs[0].id, "storeData:" + data, function (response) {
                console.log(response); // gives undefined :(
            });
        });

    content script:

        chrome.runtime.onMessage.addListener(function (request, sender, sendResponse) {
            // not working
        });

    My question: why isn't this approach working? Is there another (better) approach for this procedure?
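
    One likely culprit: when the options page is the active tab, tabs.query({ active: true, currentWindow: true }) returns the options tab itself, which has no content script to answer. An approach that sidesteps messaging entirely, since chrome.storage is shared between extension pages and content scripts, is sketched below (assumes the "storage" permission; the key name is a placeholder):

        // options.js – write the setting directly
        chrome.storage.sync.set({ myData: "abcd" });

        // content script – react whenever the setting changes
        chrome.storage.onChanged.addListener(function (changes, area) {
            if (area === "sync" && changes.myData) {
                console.log("new value:", changes.myData.newValue);
            }
        });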

    Read the article

  • 2 sites each in a different country with 1 set of content (cloaking)

    - by Greg
    Hi, I have a question re: cloaking. I have a friend who has a business in Canada and the UK. Currently the .ca site is hosted on GoDaddy. The .co.uk domain is registered with Domainmonster (with a UK IP address) and uses a cloaked/framed redirect to the .ca site. As a result (my assumption), the .ca site is indexed fine by Google while the .co.uk is not. The content is generic for both sites. How do I point the .co.uk site directly at the content independently (preferably without hosting a duplicate of the content in the UK), so that, for instance, if the .ca domain were taken away altogether, the .co.uk domain would remain an entity in itself from Google's point of view? Does Google index a generic set of content and then associate different country domains with that content? I hope I have explained this ok. Thanks, Greg

    Read the article

  • Access <body> element from content page via a nested master page

    - by danwellman
    All I want to do is access the <body> element from the code-behind of a content page and add a class name to it. I have a top-level master page with the <body> element in it. Then I have a nested master page, which is the master page for the content page. From the code-behind of the content page I want to add a class name to the body element. That's all.

    I have this in the top-level master:

        <body id="bodyNode" runat="server">

    I added this to the code-behind for the content page:

        Master.bodyNode.Attributes.Add("class", "home-page");

    And I get a message that 'System.Web.UI.MasterPage' does not contain a definition for 'bodyNode'. If I add this to the aspx content page:

        <%@ MasterType VirtualPath="~/MasterPage.master" %>

    the message then changes to: bodyNode is inaccessible due to its protection level.

    Please advise, I've wasted like 2 hours on what feels like something that should be really simple to do :(
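
    For reference, one route that avoids the MasterType plumbing altogether is to walk up to the top-level master and find the control by ID; a minimal sketch (assumes the nesting is exactly content page > nested master > top-level master):

        // in the content page's code-behind
        // (needs: using System.Web.UI.HtmlControls;)
        // Master is the nested master, so Master.Master is the top-level one
        HtmlGenericControl body =
            (HtmlGenericControl)Page.Master.Master.FindControl("bodyNode");
        if (body != null)
            body.Attributes["class"] = "home-page";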

    Read the article

  • jQuery Dialog open hides background content in IE7

    - by JoshReedSchramm
    Hey all, I'm working on an application where the page layout is essentially a bunch of absolutely positioned containers. There is a header, a sidebar, and a content area. The content area has its own header and footer, which are fixed on the screen. The main part of the content area scrolls, but nothing else does. We are using jQuery dialog for popup boxes, and in IE7, with this new layout, when I open a dialog the rest of my content in the background vanishes. I'm left with the page header and the footer of the content area, and that's it. In Firefox, and believe it or not IE6, it works fine. (We are using ie7.js in IE6, though, to make things not suck.) Does anyone have any clue why opening a dialog in jQuery would screw up absolute positioning and make stuff vanish?

    Read the article

  • android : Dynamically changing the content of tab

    - by Jomia
    I want to change the content of a tab. When the tab is created, I set its content with the setContent() method. But if I click the tab again, I want to change the content, that is, switch to another activity. I used the setOnTabChangedListener() method, but I am not sure how to set the content to another intent:

        Resources res = getResources();
        TabHost tabHost = getTabHost();
        tabHost.addTab(tabHost.newTabSpec("tab1").setIndicator("HOME")
                .setContent(new Intent(getBaseContext(), homeGroup.class)));
        tabHost.addTab(tabHost.newTabSpec("tab2").setIndicator("ABOUT US")
                .setContent(new Intent(getBaseContext(), aboutusGroup.class)));
        tabHost.setCurrentTab(0);
        tabHost.setOnTabChangedListener(new OnTabChangeListener() {
            @Override
            public void onTabChanged(String tabId) {
                // here I want to set the content of each tab to another intent:
                // for 'tab1', change to home.class
                // for 'tab2', change to aboutus.class
                // how to set these?
            }
        });

    Please help me. Thank you.
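
    Given the homeGroup/aboutusGroup names, this looks like the old ActivityGroup pattern, where the group itself swaps what it displays; a minimal hedged sketch of that swap inside the group activity (the "home" id and home.class are placeholders):

        // inside homeGroup (an ActivityGroup): replace the displayed activity
        Intent intent = new Intent(getBaseContext(), home.class);
        intent.addFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP);
        Window window = getLocalActivityManager().startActivity("home", intent);
        setContentView(window.getDecorView());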

    Read the article

  • Loading external content with jquery or iframe?

    - by nailuenlue
    Hiho, there's an existing website that I need to include into another site. It goes like this: a.mysite.com, and I need to fetch content from this site into my www.mysite.com website. As I need to access the content of the iframe, the same-origin policy is a problem here. What I did was configure mod_proxy on Apache to proxy-pass all requests from www.mysite.com/a to a.mysite.com. This works fine, but my problem is that I'm not sure of the best way to include those pages. Idea 1: as the content of the iframe is a full-featured site with a top navigation, left navigation, etc., I would need to change the page template to show only the content box, to be able to integrate that page in the iframe. Idea 2: I could just load the DIV where the content lies through jQuery.load() and integrate it into my site. What is the best way to accomplish such a task? And how bad are both ideas from an SEO point of view?
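
    For idea 2, jQuery's .load() can fetch just a fragment of the proxied page by appending a selector to the URL; a minimal sketch (URL and selectors are placeholders):

        // fetch only the #content div of the proxied page into #target
        $('#target').load('/a/some-page.html #content');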

    Read the article

  • jQuery ready() on downloaded content

    - by David
    Hi all! jQuery.ready() allows us to wait for the construction of the webpage, and recently support was added for waiting until CSS files are loaded. I would like to know if that feature can be used for downloaded content, because I fetch content via $.ajax() that holds CSS references, and I would like to retrieve the CSS before working with the retrieved content. The desired flow: fetch the HTML with $.ajax(); wait until all of its CSS is downloaded; show the fetched content (already styled). Thank you very much for your help.
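
    There is no built-in "ready" for stylesheets injected with fetched content, but you can count the fragment's link elements yourself and reveal the content once each has fired its load event; a rough sketch (selectors are placeholders, and load events on link elements are not reliable in older browsers):

        // keep the fetched fragment hidden until its stylesheets load
        $.get('/fragment.html', function (html) {
            var $content = $(html).hide().appendTo('#target');
            var $links = $content.find('link[rel="stylesheet"]');
            var pending = $links.length;
            if (pending === 0) { $content.show(); return; }
            $links.on('load', function () {
                if (--pending === 0) { $content.show(); } // all CSS is in
            });
        });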

    Read the article

  • ASP.NET GZip Encoding Caveats

    - by Rick Strahl
GZip encoding in ASP.NET is pretty easy to accomplish using the built-in GZipStream and DeflateStream classes and applying them to the Response.Filter property. While applying GZip and Deflate behavior is pretty easy, there are a few caveats that you have to watch out for, as I found out today for myself with an application that was throwing up some garbage data. But before looking at caveats let's review GZip implementation for ASP.NET.

ASP.NET GZip/Deflate Basics

Response filters are applied to the Response.OutputStream and transform it as data is written to it through the ASP.NET Response object. So a Response.Write eventually gets written into the output stream, which, if a filter is assigned, is also written through the filter stream's interface. To perform the actual GZip (and Deflate) encoding typically used by Web pages, .NET includes the GZipStream and DeflateStream stream classes, which can be readily assigned to the Response.OutputStream. With these two stream classes in place it's almost trivially easy to create a couple of reusable methods that allow you to compress your HTTP output.

In my standard WebUtils utility class (from the West Wind Web Toolkit) I created two static utility methods – IsGZipSupported and GZipEncodePage – that check whether the client supports GZip encoding and then actually encode the current output (note that although the method includes 'Page' in its name this code will work with any ASP.NET output).

    /// <summary>
    /// Determines if GZip is supported
    /// </summary>
    public static bool IsGZipSupported()
    {
        string AcceptEncoding = HttpContext.Current.Request.Headers["Accept-Encoding"];
        if (!string.IsNullOrEmpty(AcceptEncoding) &&
            (AcceptEncoding.Contains("gzip") || AcceptEncoding.Contains("deflate")))
            return true;
        return false;
    }

    /// <summary>
    /// Sets up the current page or handler to use GZip through a Response.Filter.
    /// IMPORTANT: You have to call this method before any output is generated!
    /// </summary>
    public static void GZipEncodePage()
    {
        HttpResponse Response = HttpContext.Current.Response;
        if (IsGZipSupported())
        {
            string AcceptEncoding = HttpContext.Current.Request.Headers["Accept-Encoding"];
            if (AcceptEncoding.Contains("deflate"))
            {
                Response.Filter = new System.IO.Compression.DeflateStream(
                    Response.Filter, System.IO.Compression.CompressionMode.Compress);
                Response.Headers.Remove("Content-Encoding");
                Response.AppendHeader("Content-Encoding", "deflate");
            }
            else
            {
                Response.Filter = new System.IO.Compression.GZipStream(
                    Response.Filter, System.IO.Compression.CompressionMode.Compress);
                Response.Headers.Remove("Content-Encoding");
                Response.AppendHeader("Content-Encoding", "gzip");
            }
            // Allow proxy servers to cache encoded and unencoded versions separately
            Response.AppendHeader("Vary", "Content-Encoding");
        }
    }

As you can see, the actual assignment of the filter is as simple as:

    Response.Filter = new DeflateStream(Response.Filter, System.IO.Compression.CompressionMode.Compress);

which applies the filter to the OutputStream. You also need to ensure that your response reflects the new GZip or Deflate encoding, and that any pages cached in proxy servers can differentiate between pages that were encoded with the various different encodings (or no encoding) – that is what the Vary header above is for.
Using this utility function is now trivially easy. In any ASP.NET code that wants to compress its Response output you simply use:

    protected void Page_Load(object sender, EventArgs e)
    {
        WebUtils.GZipEncodePage();

        Entry = WebLogFactory.GetEntry();
        var entries = Entry.GetLastEntries(App.Configuration.ShowEntryCount,
            "pk,Title,SafeTitle,Body,Entered,Feedback,Location,ShowTopAd", "TEntries");
        if (entries == null)
            throw new ApplicationException("Couldn't load WebLog Entries: " + Entry.ErrorMessage);

        this.repEntries.DataSource = entries;
        this.repEntries.DataBind();
    }

Here I use an ASP.NET page, but the above WebUtils.GZipEncodePage() method call will work in any ASP.NET application type, including HTTP handlers. The only requirement is that the filter needs to be applied before any other output is sent to the OutputStream. For example, in my CallbackHandler service implementation, output over a certain size is GZip encoded by default. The output that is generated is JSON or XML, and if the output is over 5k in size I apply WebUtils.GZipEncodePage():

    if (sbOutput.Length > GZIP_ENCODE_TRESHOLD)
        WebUtils.GZipEncodePage();

    Response.ContentType = ControlResources.STR_JsonContentType;
    HttpContext.Current.Response.Write(sbOutput.ToString());

Ok, so you probably get the idea: encoding GZip/Deflate content is pretty easy.

Hold on there, Hoss – Watch your Caching

Or is it? There are a few caveats that you need to watch out for when dealing with GZip content. The first issue is that you need to deal with the fact that some clients don't support GZip or Deflate content. Most modern browsers support it, but if you have a programmatic HTTP client accessing your content, GZip/Deflate support is by no means guaranteed. For example, WinInet HTTP clients don't support GZip out of the box – it has to be explicitly implemented. Other low-level HTTP clients on other platforms don't support GZip out of the box either.

The problem is that your application, your Web server, and proxy servers on the Internet might be caching your generated content. If you return content with GZip once and then again without, either caching is not applied, or worse, the wrong type of content is returned back to the client from a cache or proxy. The result is an unreadable response for *some clients*, which is also very hard to debug and fix once in production.

You already saw the issue of proxy servers addressed in the GZipEncodePage() function:

    // Allow proxy servers to cache encoded and unencoded versions separately
    Response.AppendHeader("Vary", "Content-Encoding");

This ensures that any proxy servers also check the Content-Encoding HTTP header to cache their content – not just the URL.

The same thing applies if you do output caching in your own ASP.NET code. If you generate output for GZip on an OutputCached page, the GZipped content will be cached (either by ASP.NET's cache or in some cases by the IIS kernel cache). But what if the next client doesn't support GZip? She'll get served a cached GZip page that won't decode, and she'll get a page full of garbage. Wholly undesirable.
To fix this you need to add a custom OutputCache rule by way of the GetVaryByCustomString() HttpApplication method in your global.asax file:

    public override string GetVaryByCustomString(HttpContext context, string custom)
    {
        // Override caching for compression
        if (custom == "GZIP")
        {
            string acceptEncoding = context.Request.Headers["Accept-Encoding"];
            if (string.IsNullOrEmpty(acceptEncoding))
                return "";
            else if (acceptEncoding.Contains("gzip"))
                return "GZIP";
            else if (acceptEncoding.Contains("deflate"))
                return "DEFLATE";
            return "";
        }
        return base.GetVaryByCustomString(context, custom);
    }

In a page that uses output caching you then specify VaryByCustom="GZIP" to use that custom rule:

    <%@ OutputCache Duration="180" VaryByParam="none" VaryByCustom="GZIP" %>

It's all Fun and Games until ASP.NET throws an Error

Ok, so you're up and running with GZip, you have your caching squared away, and the pages you are applying it to are jamming along. Then BOOM, something strange happens and you get a lovely garbled page of binary goo instead of HTML.

What's happened here is that I have WebUtils.GZipEncodePage() applied to my page, but there's an error in the page. The error falls back to the ASP.NET error handler, and the error handler removes all existing output (good) and removes all the custom HTTP headers I've set manually (usually good, but very bad here). Since I applied the Response.Filter (via GZipEncodePage) the output is still GZip encoded, but ASP.NET has removed my Content-Encoding header, so the browser receives the GZip encoded content without any notification that it is encoded as GZip. The result is binary output. Here's what Fiddler says about the raw HTTP header output when an error occurs while GZip encoding was applied:

    HTTP/1.1 500 Internal Server Error
    Cache-Control: private
    Content-Type: text/html; charset=utf-8
    Date: Sat, 30 Apr 2011 22:21:08 GMT
    Content-Length: 2138
    Connection: close

    ?`I?%&/m?{J?J??t??` … binary output stripped here

Notice: no Content-Encoding header, and that's why we're seeing this garbage. ASP.NET has stripped the Content-Encoding header but left our filter intact.

So how do we fix this? In my applications I typically have a global Application_Error handler set up, and in this case I've been using that. One thing that you can do in the Application_Error handler is explicitly clear out the Response.Filter and set it to null at the top:

    protected void Application_Error(object sender, EventArgs e)
    {
        // Remove any special filtering, especially GZip filtering
        Response.Filter = null;
        …
    }

And voilà, I get my Yellow Screen of Death or my custom generated error output back via uncompressed content. BTW, the same is true for page-level errors handled in Page_Error or ASP.NET MVC error handling methods in a controller.
Another and possibly even better solution is to check whether a filter is attached just before the headers are sent to the client, as pointed out by Adam Schroeder in the comments:

    protected void Application_PreSendRequestHeaders()
    {
        // Ensure that if GZip/Deflate encoding is applied the headers are set;
        // this also works when an error occurs and the filters are still active
        HttpResponse response = HttpContext.Current.Response;
        if (response.Filter is GZipStream &&
            response.Headers["Content-encoding"] != "gzip")
            response.AppendHeader("Content-encoding", "gzip");
        else if (response.Filter is DeflateStream &&
                 response.Headers["Content-encoding"] != "deflate")
            response.AppendHeader("Content-encoding", "deflate");
    }

This uses the Application_PreSendRequestHeaders() pipeline event to check for a compression filter and adjusts the headers accordingly. This is actually a better solution, since it is generic – it'll work regardless of how the content is cleaned up. For example, after an error a Response.Redirect() or a short error display might get sent with the filter not cleared, and this code handles that case as well. Sweet, thanks Adam.

It's unfortunate that ASP.NET doesn't natively clear out the Response.Filter when an error occurs, just as it clears the Response and headers. I can't see where leaving a filter in place in an error situation would make any sense, but hey – this is what it is, and it's easy enough to fix as long as you know where to look. Riiiight!

IIS and GZip

I should also mention that IIS 7 includes good support for compression natively. If you can defer encoding and let IIS perform it for you rather than doing it in your code, by all means you should! Especially any static or semi-dynamic content that can be made static should be using IIS built-in compression. Dynamic caching is also supported, but is a bit more tricky to judge in terms of performance and footprint. John Forsyth has a great article on the benefits and drawbacks of IIS 7 compression, which gives some detailed performance comparisons and impact reviews. I'll post another entry next with some more info on IIS compression, since information on it seems to be a bit hard to come by.

Related Content
Built-in GZip/Deflate Compression in IIS 7.x
HttpWebRequest and GZip Responses

© Rick Strahl, West Wind Technologies, 2005-2011
Posted in ASP.NET, IIS7

    Read the article
