Search Results

Search found 2558 results on 103 pages for 'significant digits'.

Page 51/103 | < Previous Page | 47 48 49 50 51 52 53 54 55 56 57 58  | Next Page >

  • After turning my monitor off and on, it will display only a white screen

    - by Narf the Mouse
    About a month after installing a new graphics card, I started encountering a rather frustrating problem. Namely, if I turn my monitor off for any significant length of time, then turn it back on, it displays only a white screen. Previously, restarting could fix the problem. However, after leaving the computer off last night, the problem persists. An internet search turned up this site; however, the monitor cable is not loose. As for the insides of the monitor - Well, I could poke around, but I risk making it worse if it's not the monitor. Any such instructions should be clear, detailed and include pictures. Further updates as events warrant.

    Read the article

  • Cannot connect to internet with Clearwire modem

    - by ide
    I'm currently using a Motorola WiMAX modem (CPEi 25725) and cannot connect to the internet. I can connect to the modem at 192.168.15.1 and check its status. It says that it has good/excellent connectivity to the internet and shows all five signal bars. Additionally it has sent and received some WiMAX packets so I believe it is connected to a tower. I'm at a loss for what the problem is. Unplugging the modem, restarting it from software, and restarting my computer (Windows 7) have not helped. Windows still reports that it is not connected to the internet. Alternatively, could this be an ISP issue? I have heard that Clearwire is a not-so-reputable ISP that blocks VoIP, and I was using Skype recently. EDIT: I called Clear's tech support and apparently their network is having significant problems at the moment. Guess there's nothing an end-user can do about it.

    Read the article

  • Removing 'bundled' software to make a slow laptop faster?

    - by spdegabrielle
    My brother-in-law has a cheap HP laptop used by his kids for schoolwork. It had got into a bit of a state and was running slowly with some dubious software. I removed a bunch of stuff that had been installed but was obviously not required (three different driver scanners!), had been downloaded in error, or looked like malware. I also disabled as many 'start on login' apps as I could and removed AVG, replacing it with MSE (AVG was uninstalled and replaced with MSE after it failed to detect malware). What remains is a significant quantity of bundled HP and 'Nero backup' software, including an HP restore utility (apparently something like the OS X hidden restore partition) and the trackpad driver. Is there anything else I can do to breathe a little more life into the old 'Celeron' laptop? Should I bite the bullet and just put Win 8 on it? Will the trackpad still work?

    Read the article

  • Disk fragmentation when dealing with many small files

    - by Zorlack
    On a daily basis we generate about 3.4 million small JPEG files. We also delete about 3.4 million 90-day-old images. To date, we've dealt with this content by storing the images in a hierarchical manner. The hierarchy is something like this: /Year/Month/Day/Source/ This hierarchy allows us to efficiently delete whole days' worth of content across all sources. The files are stored on a Windows 2003 server connected to a 14-disk SATA RAID 6. We've started having significant performance issues when writing to and reading from the disks. This may be due to the performance of the hardware, but I suspect that disk fragmentation may be a culprit as well. Some people have recommended storing the data in a database, but I've been hesitant to do this. Another thought was to use some sort of container file, like a VHD or something. Does anyone have any advice for mitigating this kind of fragmentation?
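
    For illustration only, here is a minimal C# sketch of the layout described in the question, assuming a hypothetical root path (D:\Images) and the 90-day retention window; it builds the /Year/Month/Day/Source/ path for an incoming image and prunes whole day folders past the cutoff. It is a sketch of the scheme, not the poster's actual code.

        using System;
        using System.IO;

        class ImageStore
        {
            // Hypothetical root of the image share; not taken from the original post.
            const string Root = @"D:\Images";

            // Build the /Year/Month/Day/Source/ folder for an incoming image.
            static string FolderFor(DateTime stamp, string source)
            {
                return Path.Combine(Root,
                    stamp.Year.ToString("D4"),
                    stamp.Month.ToString("D2"),
                    stamp.Day.ToString("D2"),
                    source);
            }

            // Delete every day folder older than the retention window, which removes
            // that day's content across all sources in a single recursive delete.
            static void PruneOldDays(int retentionDays = 90)
            {
                DateTime cutoff = DateTime.Today.AddDays(-retentionDays);
                foreach (string yearDir in Directory.GetDirectories(Root))
                foreach (string monthDir in Directory.GetDirectories(yearDir))
                foreach (string dayDir in Directory.GetDirectories(monthDir))
                {
                    int year  = int.Parse(Path.GetFileName(yearDir));
                    int month = int.Parse(Path.GetFileName(monthDir));
                    int day   = int.Parse(Path.GetFileName(dayDir));
                    if (new DateTime(year, month, day) < cutoff)
                        Directory.Delete(dayDir, recursive: true);
                }
            }
        }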

    Read the article

  • Can Visual Studio track the "size" or "severity" of my changes in TFS?

    - by anaximander
    I'm working on a sizeable project using VS2012 and TFS (also 2012, I think - I didn't set up the server). A lot of my recent tasks have required making very small changes to a lot of files, so I'm quite used to seeing a lot of items in my Pending Changes list. Is there a way to have VS and/or TFS track how much has been changed and let me know when the differences are becoming significant? Similarly, is there a way to quickly highlight where the major changes are when you get the latest version from TFS? It'd really help with tracking down where certain changes have been made without having to go through and compare every file - the difference highlighting tool might be nice, but when you have to use it on a dozen files to find the block you're looking for, you start to wonder if there's a faster way...
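
    As far as I know there is no built-in "change size" metric in VS/TFS 2012, but as a rough workaround you can pull the pending change list through the TFS client object model and summarize it yourself. A sketch under those assumptions (the collection URL is a placeholder, and a per-change-type count is only a crude proxy for "severity"):

        using System;
        using System.Linq;
        using Microsoft.TeamFoundation.Client;
        using Microsoft.TeamFoundation.VersionControl.Client;

        class PendingChangeSummary
        {
            static void Main()
            {
                // Placeholder collection URL; point this at your own TFS server.
                var collection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
                    new Uri("http://tfs:8080/tfs/DefaultCollection"));
                var versionControl = collection.GetService<VersionControlServer>();

                // Enumerate the current user's workspaces on this machine.
                foreach (Workspace workspace in versionControl.QueryWorkspaces(
                    null, versionControl.AuthorizedUser, Environment.MachineName))
                {
                    PendingChange[] changes = workspace.GetPendingChanges();
                    Console.WriteLine("Workspace {0}: {1} pending change(s)",
                        workspace.Name, changes.Length);

                    // Count pending items per change type as a crude "size of change" signal.
                    foreach (var group in changes.GroupBy(c => c.ChangeType))
                        Console.WriteLine("  {0}: {1}", group.Key, group.Count());
                }
            }
        }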

    Read the article

  • InnoDb Overhead?

    - by Rimary
    I just converted several large tables from MyISAM to InnoDB. When I view the tables in phpMyAdmin, they show a significant amount of overhead (one table has 6.8 GB). Optimizing the tables (which isn't a supported command on InnoDB) has no effect like it does on MyISAM. Is this a result of InnoDB's ever-growing data file that never returns space, even after deletes? If that's the case, I've never seen overhead like this before from other InnoDB tables. Is there a way to clean this up? Edit: Here are the things I've tried (with no success):
        Optimize Table
        Reorder table by primary key
        Defragment table

    Read the article

  • Password not accepted when resuming from sleep

    - by Comrade
    My HP Pavilion dv-series notebook will not accept the user account password for the logged-in user when resuming from sleep - the message returned is Incorrect Password. Simply selecting the Switch User option reloads the login screen and allows me to log in as the same user with the same password that was originally rejected. And yes, I've tried it more than one (hundred) times in case you were thinking it's just a case of slippery fingers. Another interesting point is that it appears to be independent of the software running on the machine. Since the issue first appeared, I have done two clean installs where all HD partitions were wiped and new ones created during fresh installation of the OS. The first such clean install was of Ubuntu 10.04 (Lucid) amd64, the second of Win 7 Pro 64 (from a boxed disc, activated post-install). Exactly the same symptoms described above are exhibited on both platforms. I've engaged in a significant amount of Googling and come up empty, so any ideas are welcome.

    Read the article

  • My HP ProBook 4520s running Windows 7 hangs intermittently

    - by Rory Alsop
    I have not been able to trace a consistent cause, but a few times a day my laptop will hang for up to about a minute. I can still move the cursor, which displays as the wait icon for whatever application I was last in, but I cannot carry out any other actions. Unfortunately I can't Ctrl+Alt+Del to bring up Task Manager, and I just have to wait. I can't pin it down to a particular application either, but generally I have office apps, a browser, or other tools open. I'm tempted to think it may be a network timeout on something, as I can't think of anything else that would delay for that long with such a significant impact, but as I'm more of a Unix person I thought it would be worth asking here.

    Read the article

  • Host data transfer limit calculations and network protocol headers

    - by UpTheCreek
    OK, this might be a really stupid question, but... I'm building a web app that utilises websockets. There's fairly rapid messaging going on, so I've been taking a look at the network traffic with Wireshark, to see if there's any way of reducing the amount of data we are sending over the wire, and hence costs. A typical message has an approx. 150-byte data payload, and according to Wireshark the lower-layer stuff takes up about:
        Ethernet: 14 bytes
        IP: 20 bytes
        TCP: 20 bytes
    My question is: are these network headers included in data transfer calculations? What about TCP ACK messages (another 54 bytes according to Wireshark)? This may seem petty, but because we have so much messaging going on, and because the payload is a similar size to these headers, it's significant.
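
    As a back-of-the-envelope illustration of why this matters, here is a small sketch using only the numbers quoted above; it assumes the worst case of one empty ACK frame per data frame (real stacks typically ACK every other segment), so treat the result as an upper bound:

        using System;

        class WireOverhead
        {
            static void Main()
            {
                // Figures taken from the post (per Ethernet frame).
                const int payload      = 150;          // application data per message
                const int frameHeaders = 14 + 20 + 20; // Ethernet + IP + TCP = 54 bytes
                const int ackFrame     = 54;           // empty ACK frame, headers only

                // Worst case: every data frame is acknowledged by its own ACK frame.
                int bytesOnWire = payload + frameHeaders + ackFrame;

                Console.WriteLine("Payload:       {0} bytes", payload);
                Console.WriteLine("Total on wire: {0} bytes", bytesOnWire);
                Console.WriteLine("Overhead:      {0:P0} of the payload",
                    (bytesOnWire - payload) / (double)payload);   // roughly 72%
            }
        }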

    Read the article

  • Why does pinging 192.168.072 (only 2 dots) return a response from 192.168.0.58?

    - by George Duckett
    I mistakenly missed the dot off of an IP address and typed in 192.168.072. To my surprise I connected to a machine at 192.168.0.58 If I ping 192.168.072 I get responses from 192.168.0.58. Why is this? I'm on a Windows PC on a Windows domain. If I ping 192.168.72 I get a response from 192.168.0.72, so it seems the 0 in 072 (in my original mistake) is significant.
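
    For illustration, here is the arithmetic behind that behaviour: classic inet_addr-style parsers treat a dotted part with a leading zero as octal, and in the three-part form a.b.c the final part fills the last two bytes of the address. The sketch below is a hand-rolled demonstration of that convention rather than a claim about any specific parsing API:

        using System;

        class ShortIpForms
        {
            static void Main()
            {
                // "072" has a leading zero, so inet_addr-style parsers read it as octal.
                int lastPart = Convert.ToInt32("072", 8);   // octal 72 == 58 decimal

                // In the three-part form a.b.c, the last part supplies the final 16 bits.
                int thirdOctet  = lastPart >> 8;    // high byte -> 0
                int fourthOctet = lastPart & 0xFF;  // low byte  -> 58

                Console.WriteLine("192.168.072 -> 192.168.{0}.{1}", thirdOctet, fourthOctet);  // 192.168.0.58

                // Without the leading zero, "72" is decimal, so 192.168.72 -> 192.168.0.72.
                int plain = 72;
                Console.WriteLine("192.168.72  -> 192.168.{0}.{1}", plain >> 8, plain & 0xFF); // 192.168.0.72
            }
        }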

    Read the article

  • System Center 2012 R2 System Discovery Network Utilization

    - by AtomicReaction
    I'm in charge of a deployment of Microsoft System Center Configuration Manager 2012 R2. Currently, I'm working through the discovery methods and trying to decide how to enable automatic discovery of systems and users. In Microsoft's documentation, they warn that Configuration Manager automatic discovery traffic can get pretty significant if you aren't careful with your implementation. Can anyone who has used this give me some information on how much traffic I should expect? We currently have around 1000 computers and 4000 user accounts in Active Directory. Thanks!

    Read the article

  • Disable address bar in Internet Explorer 9

    - by token
    I'm trying to disable the address bar in IE9. I've done a significant amount of searching on this and just can't seem to find a way to make it happen. A lot of web resources discuss how to do it in IE8, but not IE9. The reason, you might ask? I have an application hosted on a remote desktop farm that links out of the application to web pages, which open in Internet Explorer. I need to ensure users are limited to just the pages the program pushes them to. I realize I could use a proxy server to limit where they can go, but I'm trying to find a really simple way to just disable the address bar instead. I can't use Kiosk mode because it puts the browser into full-screen mode. This won't work for my situation, as I need to give users what appears to be a regular browsing experience without an address bar.

    Read the article

  • How to synchronize tasks on multiple computers

    - by SysGen
    I would like to be able to run similar tasks on several computers that must be precisely synced. More specifically, I need 4 laptops to be synced, probably over the local network, and I need to use one of them to start a task (play a video) on all of them at the same time (different video files on different laptops). All of them are running Windows. Is there third-party software or any easier way to do this over the LAN without significant delay? I need virtually no response time - if a single video were streaming on all laptops, people should not notice the delay.
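
    For what it's worth, a minimal sketch of one possible roll-your-own approach (not a recommendation of any particular product): the controlling laptop broadcasts a UDP "start at time T" message, and each listener waits until that shared timestamp before launching its video, so network jitter matters less than clock agreement between the machines. The port number and message format here are made up for the example.

        using System;
        using System.Net;
        using System.Net.Sockets;
        using System.Text;
        using System.Threading;

        class SyncedStart
        {
            const int Port = 49500; // arbitrary port chosen for this example

            // Run on the controlling laptop: broadcast a start time 2 seconds from now.
            static void SendStartSignal()
            {
                DateTime startAt = DateTime.UtcNow.AddSeconds(2);
                byte[] message = Encoding.ASCII.GetBytes(startAt.ToString("o"));
                using (var udp = new UdpClient { EnableBroadcast = true })
                {
                    udp.Send(message, message.Length, new IPEndPoint(IPAddress.Broadcast, Port));
                }
            }

            // Run on each listener: wait for the signal, sleep until the agreed time, then play.
            static void WaitForStartSignal()
            {
                using (var udp = new UdpClient(Port))
                {
                    var remote = new IPEndPoint(IPAddress.Any, 0);
                    byte[] data = udp.Receive(ref remote);
                    DateTime startAt = DateTime.Parse(Encoding.ASCII.GetString(data),
                        null, System.Globalization.DateTimeStyles.RoundtripKind);

                    TimeSpan delay = startAt - DateTime.UtcNow;
                    if (delay > TimeSpan.Zero)
                        Thread.Sleep(delay);

                    // Launch the local video file here, e.g. with Process.Start(...).
                    Console.WriteLine("Starting playback at {0:O}", DateTime.UtcNow);
                }
            }
        }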

    Read the article

  • Okular (on Ubuntu 9.10) prints multiple pages per sheet (n-up) very small

    - by user23884
    I'm trying to print a set of Beamer slides with multiple slides per page (4-up or 6-up). When I select 4 or 6 pages per sheet in the Okular print dialog, the pages print quite small (perhaps even tiny -- about 1.75" by 1.25") and leave significant whitespace on the page. I can work around this behavior by using the pdfnup utility (in the pdfjam package), which correctly generates a 4- or 6-up PDF file, but it's annoying to generate a second PDF when I should be able to accomplish this task from the print dialog. Details: Ubuntu 9.10 (Karmic), 64-bit, color PostScript printer.

    Read the article

  • compiling the linux kernel

    - by user482819
    Just for learning, I have recompiled the Linux kernel with different options, installed it, and booted from it. It was both instructive and straightforward. However, I was overwhelmed by the huge number of options available. My questions are:
        1. Does it make sense to spend time trying to optimize the Linux kernel for my particular laptop? Will it make a significant improvement?
        2. Is there any tool that can read the configuration of my computer and suggest a config?
    Thanks, H

    Read the article

  • In what platforms/browsers does Adobe Flash come preinstalled?

    - by Michael Haren
    In what platforms/browsers does Adobe Flash come preinstalled? I'm pretty sure neither Firefox nor Safari includes it on any platform, but Chrome on Windows does (not sure about Mac or Linux). What about IE on a completely vanilla Win XP system? Or Win 7? Maybe manufacturers like Dell add it if MS doesn't? Perhaps we can find (or build) a comprehensive list. Considering that Flash is a significant piece of the web platform, I hope I've asked this in the right place.

    Read the article

  • OS Isolation: Virtualization or Dual-Boot Duplication, a General How To?

    - by Mr_CryptoPrime
    I want to isolate my Windows 7 operating system, and I have looked into virtualization. This should work with Linux; however, I still want a way to run Windows 7 securely but without significant performance loss, which rules out virtualization for that. I know that you can dual boot because I currently do so with my XP/Linux system. Is there a way I can duplicate my Windows 7 system so I can select one at bootup? This way I can ensure that each OS is isolated and not worry about performance loss. However, I am having a lot of trouble finding a solid method for OS duplication. Is this even possible, or must I buy two copies of Win 7 and somehow install them separately? Any information regarding this would be helpful, thanks! Essentially I want:
        Two instances of Win 7 (not necessarily running simultaneously)
        Each isolated from the other, so that a security breach in one doesn't affect the other
        No performance loss in either from doing so

    Read the article

  • Speed up connection to MySQL

    - by Leonick
    So here's one for you. Any idea on how to shorten the time it takes to connect to a MySQL database? The reason I'm wondering is that just connecting to the DB adds just over a second to the rendering of the page, which seems a bit long considering Apache and MySQL are running on the same machine and mysqli_connect is connecting to localhost. It's a shame when the connection takes a second while any query I end up running won't add any significant amount of time to the render/load time. Any way to shorten the time it takes to open a connection?
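
    The post is about PHP/mysqli, but as a general illustration of separating connect time from query time, here is a rough timing sketch using MySQL Connector/NET; the connection string values are placeholders, and the same idea applies to timing mysqli_connect against the subsequent queries:

        using System;
        using System.Diagnostics;
        using MySql.Data.MySqlClient;   // MySQL Connector/NET

        class ConnectTiming
        {
            static void Main()
            {
                // Placeholder credentials; substitute your own.
                const string connStr = "Server=localhost;Database=test;Uid=appuser;Pwd=secret;";

                var sw = Stopwatch.StartNew();
                using (var conn = new MySqlConnection(connStr))
                {
                    conn.Open();                 // time spent establishing the connection
                    Console.WriteLine("Connect: {0} ms", sw.ElapsedMilliseconds);

                    sw.Restart();
                    using (var cmd = new MySqlCommand("SELECT 1", conn))
                    {
                        cmd.ExecuteScalar();     // time spent on a trivial query
                    }
                    Console.WriteLine("Query:   {0} ms", sw.ElapsedMilliseconds);
                }
            }
        }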

    Read the article

  • Is SQL Server DRI (ON DELETE CASCADE) slow?

    - by Aaronaught
    I've been analyzing a recurring "bug report" (perf issue) in one of our systems related to a particularly slow delete operation. Long story short: It seems that the CASCADE DELETE keys were largely responsible, and I'd like to know (a) if this makes sense, and (b) why it's the case.

    We have a schema of, let's say, widgets, those being at the root of a large graph of related tables and related-to-related tables and so on. To be perfectly clear, deleting from this table is actively discouraged; it is the "nuclear option" and users are under no illusions to the contrary. Nevertheless, it sometimes just has to be done. The schema looks something like this:

        Widgets
        |
        +--- Anvils (1:1)
        |    |
        |    +--- AnvilTestData (1:N)
        |
        +--- WidgetHistory (1:N)
             |
             +--- WidgetHistoryDetails (1:N)

    Nothing too scary, really. A Widget can be different types, an Anvil is a special type, so that relationship is 1:1 (or more accurately 1:0..1). Then there's a large amount of data - perhaps thousands of rows of AnvilTestData per Anvil collected over time, dealing with hardness, corrosion, exact weight, hammer compatibility, usability issues, and impact tests with cartoon heads. Then every Widget has a long, boring history of various types of transactions - production, inventory moves, sales, defect investigations, RMAs, repairs, customer complaints, etc. There might be 10-20k details for a single widget, or none at all, depending on its age.

    So, unsurprisingly, there's a CASCADE DELETE relationship at every level here. If a Widget needs to be deleted, it means something's gone terribly wrong and we need to erase any records of that widget ever existing, including its history, test data, etc. Again, nuclear option. Relations are all indexed, statistics are up to date. Normal queries are fast. The system tends to hum along pretty smoothly for everything except deletes.

    Getting to the point here, finally, for various reasons we only allow deleting one widget at a time, so a delete statement would look like this:

        DELETE FROM Widgets
        WHERE WidgetID = @WidgetID

    Pretty simple, innocuous looking delete... that takes over 2 minutes to run, for a widget with no data! After slogging through execution plans I was finally able to pick out the AnvilTestData and WidgetHistoryDetails deletes as the sub-operations with the highest cost. So I experimented with turning off the CASCADE (but keeping the actual FK, just setting it to NO ACTION) and rewriting the script as something very much like the following:

        DECLARE @AnvilID int
        SELECT @AnvilID = AnvilID
        FROM Anvils
        WHERE WidgetID = @WidgetID

        DELETE FROM AnvilTestData
        WHERE AnvilID = @AnvilID

        DELETE FROM WidgetHistory
        WHERE HistoryID IN (
            SELECT HistoryID
            FROM WidgetHistory
            WHERE WidgetID = @WidgetID)

        DELETE FROM Widgets
        WHERE WidgetID = @WidgetID

    Both of these "optimizations" resulted in significant speedups, each one shaving nearly a full minute off the execution time, so that the original 2-minute deletion now takes about 5-10 seconds - at least for new widgets, without much history or test data. Just to be absolutely clear, there is still a CASCADE from WidgetHistory to WidgetHistoryDetails, where the fanout is highest, I only removed the one originating from Widgets. Further "flattening" of the cascade relationships resulted in progressively less dramatic but still noticeable speedups, to the point where deleting a new widget was almost instantaneous once all of the cascade deletes to larger tables were removed and replaced with explicit deletes.

    I'm using DBCC DROPCLEANBUFFERS and DBCC FREEPROCCACHE before each test. I've disabled all triggers that might be causing further slowdowns (although those would show up in the execution plan anyway). And I'm testing against older widgets, too, and noticing a significant speedup there as well; deletes that used to take 5 minutes now take 20-40 seconds. Now I'm an ardent supporter of the "SELECT ain't broken" philosophy, but there just doesn't seem to be any logical explanation for this behaviour other than crushing, mind-boggling inefficiency of the CASCADE DELETE relationships.

    So, my questions are:
        Is this a known issue with DRI in SQL Server? (I couldn't seem to find any references to this sort of thing on Google or here in SO; I suspect the answer is no.)
        If not, is there another explanation for the behaviour I'm seeing?
        If it is a known issue, why is it an issue, and are there better workarounds I could be using?

    Read the article

  • Why don't I just build the whole web app in Javascript and Javascript HTML Templates?

    - by viatropos
    I'm getting to the point on an app where I need to start caching things, and it got me thinking... In some parts of the app, I render table rows (jqGrid, slickgrid, etc.) or fancy div rows (like in the New Twitter) by grabbing pure JSON and running it through something like Mustache, jquery.tmpl, etc. In other parts of the app, I just render the info in pure HTML (server-side HAML templates), and if there's searching/paginating, I just go to a new URL and load a new HTML page.

    Now the problem is in caching and maintainability. On one hand I'm thinking, if everything was built using Javascript HTML templates, then my app would serve just an HTML layout/shell and a bunch of JSON. If you look at the Facebook and Twitter HTML source, that's basically what they're doing (95% json/javascript, 5% html). This would make it so my app only needed to cache JSON (pages, actions, and/or records), which means you'd hit the cache whether you were some remote API developer accessing a JSON API or the straight web app. That is, I don't need 2 caches, one for the JSON, one for the HTML. That seems like it'd cut my cache store down in half and streamline things a little bit.

    On the other hand, I'm thinking, from what I've seen/experienced, generating static HTML server-side, and caching that, seems to be much better performance-wise cross-browser; you get the graphics instantly and don't have to wait that split second for javascript to render it. StackOverflow seems to do everything in plain HTML, and you can tell... everything appears at once. Notice how on twitter.com the page is blank for .5-1 seconds, and the page chunks in: the javascript has to render the json. The downside with this is that, for anything dynamic (like endless scrolling, or grids), I'd have to create javascript templates anyway... so now I have server-side HAML templates, client-side javascript templates, and a lot more to cache.

    My question is, is there any consensus on how to approach this? What are the benefits and drawbacks from your experience of mixing the two versus going 100% with one over the other?

    Update: Some reasons that factor into why I haven't yet made the decision to go with 100% javascript templating are:
        Performance. Haven't formally tested, but from what I've seen, raw HTML renders faster and more fluidly than javascript-generated HTML cross-browser. Plus, I'm not sure how mobile devices handle dynamic HTML performance-wise.
        Testing. I have a lot of integration tests that work well with static HTML, so switching to javascript-only would require 1) more focused pure-javascript testing (Jasmine), and 2) integrating javascript into Capybara integration tests. This is just a matter of time and work, but it's probably significant.
        Maintenance. Getting rid of HAML. I love HAML, it's so easy to write, it prints pretty HTML... It makes code clean, it makes maintenance easy. Going with javascript, there's nothing as concise.
        SEO. I know Google handles the ajax /#!/path, but I haven't grasped how this will affect other search engines and how older browsers handle it. Seems like it'd require a significant setup.

    Read the article

  • Oracle VM and JRockit Virtual Edition: Oracle Introduces Java Virtualization Solution for Oracle(R)

    - by adam.hawley
    Since the beginning, we've been talking to customers about how our approach to virtualization is different and more powerful than any other company's, because Oracle has the "full-stack" of software (and even hardware these days!) to work with to create more comprehensive, more powerful solutions. Having the virtualization layer, two enterprise-class operating systems in Solaris and Enterprise Linux, and the leading enterprise software in nearly every layer of the data center stack allows us to not just do virtualization for virtualization's sake but rather to provide complete virtualization solutions focused on making enterprise software easier to deploy, easier to manage, and easier to support through integration up and down the stack.

    Today, we announced the availability of a significant demonstration of that capability by announcing a WebLogic Suite option that permits the Oracle WebLogic Server 11g to run on a Java JVM (JRockit Virtual Edition) that itself runs directly on the Oracle VM Server for x86/x64 without needing any operating system. Why would you want that? Better performance and better consolidation density, not to mention great security due to a lower "attack surface area".

    Oracle also announced the Oracle Virtual Assembly Builder product. Oracle Virtual Assembly Builder provides a framework for automatically capturing the configuration of existing software components and packaging them as self-contained building blocks known as appliances. So you know that complex application you've tweaked on your physical servers (or on other virtual environments, for that matter)? Virtual Assembly Builder will allow the automated collection of all the configuration data for the various application components that make up that multi-tier application and then use the information to create and package each component as a virtual machine, so that the application can be deployed in your Oracle VM virtualization environment quickly and easily, just as it was configured in your original environment. A slick drag-and-drop GUI also serves as a powerful, intuitive interface for viewing and editing your assembly as needed.

    No one else can do complete virtualization solutions the way Oracle can, and I think these offerings show what's possible when you have the right resources for elegantly solving the larger problems in the data center rather than just having to make do with tools that are only operating at one layer of the stack. For more information, read the press release, including the links to more information on various Oracle websites.

    Read the article

  • Ajax Control Toolkit May 2012 Release

    - by Stephen.Walther
    I’m happy to announce the May 2012 release of the Ajax Control Toolkit. This newest release of the Ajax Control Toolkit includes a new file upload control which displays file upload progress. We’ve also added several significant enhancements to the existing HtmlEditorExtender control such as support for uploading images and Source View.

    You can download and start using the newest version of the Ajax Control Toolkit by entering the following command in the Library Package Manager console in Visual Studio:

        Install-Package AjaxControlToolkit

    Alternatively, you can download the latest version of the Ajax Control Toolkit from CodePlex: http://AjaxControlToolkit.CodePlex.com

    The New Ajax File Upload Control

    The most requested new feature for the Ajax Control Toolkit (according to the CodePlex Issue Tracker) has been support for file upload with progress. We worked hard over the last few months to create an entirely new file upload control which displays upload progress. Here is a sample which illustrates how you can use the new AjaxFileUpload control:

        <%@ Page Language="C#" AutoEventWireup="true" CodeBehind="01_FileUpload.aspx.cs" Inherits="WebApplication1._01_FileUpload" %>
        <html>
        <head runat="server">
            <title>Simple File Upload</title>
        </head>
        <body>
            <form id="form1" runat="server">
            <div>
                <ajaxToolkit:ToolkitScriptManager runat="server" />
                <ajaxToolkit:AjaxFileUpload id="ajaxUpload1" OnUploadComplete="ajaxUpload1_OnUploadComplete" runat="server" />
            </div>
            </form>
        </body>
        </html>

    The page above includes a ToolkitScriptManager control. This control is required to use any of the controls in the Ajax Control Toolkit because this control is responsible for loading all of the scripts required by a control. The page also contains an AjaxFileUpload control. The UploadComplete event is handled in the code-behind for the page:

        namespace WebApplication1
        {
            public partial class _01_FileUpload : System.Web.UI.Page
            {
                protected void ajaxUpload1_OnUploadComplete(object sender, AjaxControlToolkit.AjaxFileUploadEventArgs e)
                {
                    // Generate file path
                    string filePath = "~/Images/" + e.FileName;
                    // Save upload file to the file system
                    ajaxUpload1.SaveAs(MapPath(filePath));
                }
            }
        }

    The UploadComplete handler saves each uploaded file by calling the AjaxFileUpload control’s SaveAs() method with a full file path. Here’s a video which illustrates the process of uploading a file:

    Warning: in order to write to the Images folder on a production IIS server, you need Write permissions on the Images folder. You need to provide permissions for the IIS Application Pool account to write to the Images folder. To learn more, see: http://learn.iis.net/page.aspx/624/application-pool-identities/

    Showing File Upload Progress

    The new AjaxFileUpload control takes advantage of HTML5 upload progress events (described in the XMLHttpRequest Level 2 standard). This standard is supported by Firefox 8+, Chrome 16+, Safari 5+, and Internet Explorer 10+. In other words, the standard is supported by the most recent versions of all browsers except for Internet Explorer which will support the standard with the release of Internet Explorer 10. The AjaxFileUpload control works with all browsers, even browsers which do not support the new XMLHttpRequest Level 2 standard. If you use the AjaxFileUpload control with a downlevel browser – such as Internet Explorer 9 – then you get a simple throbber image during a file upload instead of a progress indicator.
    Here’s how you specify a throbber image when declaring the AjaxFileUpload control:

        <%@ Page Language="C#" AutoEventWireup="true" CodeBehind="02_FileUpload.aspx.cs" Inherits="WebApplication1._02_FileUpload" %>
        <html>
        <head id="Head1" runat="server">
            <title>File Upload with Throbber</title>
        </head>
        <body>
            <form id="form1" runat="server">
            <div>
                <ajaxToolkit:ToolkitScriptManager ID="ToolkitScriptManager1" runat="server" />
                <ajaxToolkit:AjaxFileUpload id="ajaxUpload1" OnUploadComplete="ajaxUpload1_OnUploadComplete" ThrobberID="MyThrobber" runat="server" />
                <asp:Image id="MyThrobber" ImageUrl="ajax-loader.gif" Style="display:None" runat="server" />
            </div>
            </form>
        </body>
        </html>

    Notice that the page above includes an image with the Id MyThrobber. This image is displayed while files are being uploaded. I use the website http://AjaxLoad.info to generate animated busy wait images.

    Drag-And-Drop File Upload

    If you are using an uplevel browser then you can drag-and-drop the files which you want to upload onto the AjaxFileUpload control. The following video illustrates how drag-and-drop works:

    Remember that drag-and-drop will not work on Internet Explorer 9 or older.

    Accepting Multiple Files

    By default, the AjaxFileUpload control enables you to upload multiple files at a time. When you open the file dialog, use the CTRL or SHIFT key to select multiple files. If you want to restrict the number of files that can be uploaded then use the MaximumNumberOfFiles property like this:

        <ajaxToolkit:AjaxFileUpload id="ajaxUpload1" OnUploadComplete="ajaxUpload1_OnUploadComplete" ThrobberID="throbber" MaximumNumberOfFiles="1" runat="server" />

    In the code above, the maximum number of files which can be uploaded is restricted to a single file.

    Restricting Uploaded File Types

    You might want to allow only certain types of files to be uploaded. For example, you might want to accept only image uploads. In that case, you can use the AllowedFileTypes property to provide a list of allowed file types like this:

        <ajaxToolkit:AjaxFileUpload id="ajaxUpload1" OnUploadComplete="ajaxUpload1_OnUploadComplete" ThrobberID="throbber" AllowedFileTypes="jpg,jpeg,gif,png" runat="server" />

    The code above prevents any files except jpeg, gif, and png files from being uploaded.

    Enhancements to the HTMLEditorExtender

    Over the past months, we spent a considerable amount of time making bug fixes and feature enhancements to the existing HtmlEditorExtender control. I want to focus on two of the most significant enhancements that we made to the control: support for Source View and support for uploading images.

    Adding Source View Support to the HtmlEditorExtender

    When you click the Source View tab, the HtmlEditorExtender changes modes and displays the HTML source of the contents contained in the TextBox being extended. You can use Source View to make fine-grain changes to HTML before submitting the HTML to the server. For reasons of backwards compatibility, the Source View tab is disabled by default.
    To enable Source View, you need to declare your HtmlEditorExtender with the DisplaySourceTab property like this:

        <%@ Page Language="C#" AutoEventWireup="true" CodeBehind="05_SourceView.aspx.cs" Inherits="WebApplication1._05_SourceView" %>
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html>
        <head id="Head1" runat="server">
            <title>HtmlEditorExtender with Source View</title>
        </head>
        <body>
            <form id="form1" runat="server">
            <div>
                <ajaxToolkit:ToolkitScriptManager ID="ToolkitScriptManager1" runat="server" />
                <asp:TextBox id="txtComments" TextMode="MultiLine" Columns="60" Rows="10" Runat="server" />
                <ajaxToolkit:HtmlEditorExtender id="HEE1" TargetControlID="txtComments" DisplaySourceTab="true" runat="server" />
            </div>
            </form>
        </body>
        </html>

    The page above includes a ToolkitScriptManager, TextBox, and HtmlEditorExtender control. The HtmlEditorExtender extends the TextBox so that it supports rich text editing. Notice that the HtmlEditorExtender includes a DisplaySourceTab property. This property causes a button to appear at the bottom of the HtmlEditorExtender which enables you to switch to Source View:

    Note: when using the HtmlEditorExtender, we recommend that you set the DOCTYPE for the document. Otherwise, you can encounter weird formatting issues.

    Accepting Image Uploads

    We also enhanced the HtmlEditorExtender to support image uploads (another very highly requested feature at CodePlex). The following video illustrates the experience of adding an image to the editor:

    Once again, for backwards compatibility reasons, support for image uploads is disabled by default. Here’s how you can declare the HtmlEditorExtender so that it supports image uploads:

        <ajaxToolkit:HtmlEditorExtender id="MyHtmlEditorExtender" TargetControlID="txtComments" OnImageUploadComplete="MyHtmlEditorExtender_ImageUploadComplete" DisplaySourceTab="true" runat="server">
            <Toolbar>
                <ajaxToolkit:Bold />
                <ajaxToolkit:Italic />
                <ajaxToolkit:Underline />
                <ajaxToolkit:InsertImage />
            </Toolbar>
        </ajaxToolkit:HtmlEditorExtender>

    There are two things that you should notice about the code above. First, notice that an InsertImage toolbar button is added to the HtmlEditorExtender toolbar. This HtmlEditorExtender will render toolbar buttons for bold, italic, underline, and insert image. Second, notice that the HtmlEditorExtender includes an event handler for the ImageUploadComplete event. The code for this event handler is below:

        using System.Web.UI;
        using AjaxControlToolkit;

        namespace WebApplication1
        {
            public partial class _06_ImageUpload : System.Web.UI.Page
            {
                protected void MyHtmlEditorExtender_ImageUploadComplete(object sender, AjaxFileUploadEventArgs e)
                {
                    // Generate file path
                    string filePath = "~/Images/" + e.FileName;
                    // Save uploaded file to the file system
                    var ajaxFileUpload = (AjaxFileUpload)sender;
                    ajaxFileUpload.SaveAs(MapPath(filePath));
                    // Update client with saved image path
                    e.PostedUrl = Page.ResolveUrl(filePath);
                }
            }
        }

    Within the ImageUploadComplete event handler, you need to do two things:
    1) Save the uploaded image (for example, to the file system, a database, or Azure storage)
    2) Provide the URL to the saved image so the image can be displayed within the HtmlEditorExtender

    In the code above, the uploaded image is saved to the ~/Images folder. The path of the saved image is returned to the client by setting the AjaxFileUploadEventArgs PostedUrl property. Not surprisingly, under the covers, the HtmlEditorExtender uses the AjaxFileUpload.
    You can get a direct reference to the AjaxFileUpload control used by an HtmlEditorExtender by using the following code:

        void Page_Load()
        {
            var ajaxFileUpload = MyHtmlEditorExtender.AjaxFileUpload;
            ajaxFileUpload.AllowedFileTypes = "jpg,jpeg";
        }

    The code above illustrates how you can restrict the types of images that can be uploaded to the HtmlEditorExtender. This code prevents anything but jpeg images from being uploaded.

    Summary

    This was the most difficult release of the Ajax Control Toolkit to date. We iterated through several designs for the AjaxFileUpload control – with each iteration, the goal was to make the AjaxFileUpload control easier for developers to use. My hope is that we were able to create a control which Web Forms developers will find very intuitive. I want to thank the developers on the Superexpert.com team for their hard work on this release.

    Read the article

  • How will people upgrade from 12.10 to 14.04 after 13.04 is EOL?

    - by Dave Jones
    Looking at https://wiki.ubuntu.com/Releases, 13.04 will reach EOL in January 2014, while 12.10 will reach EOL in April 2014. Therefore, if a 12.10 user hasn't upgraded to 13.04 and subsequently to 13.10, there will be a 3-month period where a 12.10 user has a supported version of Ubuntu but will be unable to upgrade. I asked this question a number of months ago, and the suggestion was that there would hopefully be an upgrade path from 12.10 to 14.04. Could somebody confirm whether this is still the case, or if not, what the plans are for 12.10 users after 13.04 becomes EOL?

    Edited for clarification: The particular issue I was concerned about is that once 13.04 goes EOL, a 12.10 user would in theory lose the ability to upgrade once the 13.04 repos are removed from the normal release repository. Using the old-releases method would be a way around the issue; however, it would make things more complicated for a less experienced user. An alternative could be for the 13.04 repos to be left available for the 3-month interim period so that a 12.10 install could still be upgraded to 13.04 and subsequently on to 13.10; however, that doesn't seem an optimal solution, in that users may take it to mean that support for 13.04 was being continued. If a direct upgrade from 12.10 to 14.04 were to be made available, this would only be available once 14.04 was released, and it still leaves the issue of the 3 months between January and April 2014 where there may be some confusion. I suspect it's not going to affect a significant number of users; if somebody has upgraded from 12.04 LTS to 12.10, in all probability they'll have continued to upgrade to 13.04 and upwards because they'd made the choice to use current rather than LTS releases. It would just be useful to have some clarification of the situation which people can be referred to in advance of 13.04 going EOL, rather than hitting the cut-off point when it's too late for users to make the decision and they're left in limbo.

    Read the article

  • Partner Webcast – More out of Database Appliance with DB Options - 13 September 2012

    - by Thanos
    The Oracle Database Appliance is a new way to take advantage of the world's most popular database—Oracle Database 11g—in a single, easy-to-deploy and manage system. It's a complete package of software, server, storage, and networking that's engineered for simplicity; saving time and money by simplifying deployment, maintenance, and support of database workloads. But that is not all: with the support for all Oracle Database Options, Oracle Database Appliance can be the ideal solution for many use cases.

        Feature: Simplifies deployment, maintenance, and support of high-availability database workloads
        Benefit: Saves significant time and effort throughout the database administration lifecycle

        Feature: An engineered system of software, server, storage, and networking
        Benefit: High availability for a wide range of custom and packaged OLTP and data warehousing application databases

        Feature: Simple one-button installation, full-stack integrated patching and diagnostics
        Benefit: Reduces planned and unplanned downtime by automatically monitoring and logging service requests with Oracle Support

        Feature: Built using the world's #1 database
        Benefit: Protects databases from server and storage failures with Oracle Real Application Clusters and Automatic Storage Management

        Feature: Unique Pay-As-You-Grow software licensing
        Benefit: Reduces cost with flexibility to adjust your software spend as your business grows without the need for any hardware upgrades

    Discover the Oracle Database Appliance Value Proposition and learn how to position and combine it with database options to capture new business and easily roll out solutions safely and with maximum cost efficiency. This webcast is repeated once again for your benefit.

    Agenda:
        Oracle Database & Engineered Systems Innovation
        What's the Oracle Database Appliance?
        Oracle Database Appliance Value Proposition
        Oracle Database Appliance with Database Options
        Oracle Database Appliance Partners Business

    Delivery Format: This FREE online LIVE eSeminar will be delivered over the Web. Registrations received less than 24 hours prior to start time may not receive confirmation to attend. Duration: 1 hour. Register Now!

    Oracle Database Appliance is available for purchase at the Oracle Store under Engineered Systems. For any questions please contact us at partner.imc-AT-beehiveonline.oracle-DOT-com. Visit regularly our ISV Migration Center blog or follow us @oracleimc to learn more on Oracle Technologies as well as upcoming partner webcasts and events.

    Read the article

  • What's your experience with female programmers?

    - by Rachel
    Let me start by saying I'm female, but every single other female programmer I've known has been pretty terrible. The extent of their knowledge seems to be copy/paste and modify some values. Quite often they don't even try to learn new concepts, or understand what they're doing. I'm not saying good female programmers aren't out there, just that the ratio of good/bad programmers seems much worse than for males. Perhaps it's because everyone feels they have to give female programmers a chance to prove they are not biased? Or is this just me?? What have your experiences been with them?

    UPDATE: Just want to say thanks for all the responses. I've learned some interesting things and am happy to know that female programmers have such support :) My experience has been very limited with them but all bad, and I agree that it is probably due to my small sample size (around 5). I wasn't trying to be sexist with such a question, I just wanted to find out if it was really that abnormal to be a female programmer. I'm abnormal about a lot of things you'd expect from a female... I play video games in most of my spare time, I liked math so much I completed my entire math book during Christmas break one year (what can I say, I found the subject interesting), I'm not very social, I dislike shopping, I only have 2 pairs of shoes, my significant other doesn't work but does all the housework/laundry/etc... but anyways, thanks :)

    Read the article
