Search Results

Search found 12267 results on 491 pages for 'out of memory'.

  • How many websites can my server potentially hold?

    - by Daniel Kindler
    Sorry for the "noob" question, but... About how many medium-sized websites with average traffic could this server hold? Just like the average website, kind of like a small business site. How many sites could this server hold, but still maintain nice, decent speed? PowerEdge R510 PE R510 Chassis for Up to Four 3.5" Cabled Hard Drives, LED edit Processor Intel® Xeon® E5630 2.53Ghz, 12M Cache,Turbo, HT, 1066MHz Max Mem edit Memory 8GB Memory (4x2GB), 1333MHz Single Ranked UDIMMs for 1 Procs, Optimized edit Operating System SUSE Linux Enterprise Server 10, SP3, Up To 32 CPU Lic, 1 YR Sub, DIB, Media edit Red Hat Enterprise Linux Licensing Hard Drives 250GB 7.2K RPM SATA 3.5" Cabled Hard Drive edit Hard Drives 1TB 7.2K RPM SATA 3.5" Cabled Hard Drive edit Hard Drives 2 X 2TB 7.2K RPM SATA 3.5in Cabled Hard Drive Hard Drive Configuration No RAID, Embedded SATA Controller for x4 Chassis edit Power Supply 480 Watt Non-Redundant Power Supply edit Thank you!

    Read the article

  • Winlogon Application Error

    - by IT_07
    This error is happening on 6 out of 32 computers. I have created a base image from scratch using Ghost Enterprise, but the error still shows on the same machines. It happens at the logon prompt, and after doing a soft reset the message goes away, but it comes back eventually. The error message is: The instruction at "0x74eF400e" referenced memory at "0x00000000". The memory could not be "written". Winlogon.exe - Application Error. Any reason why this is happening? I have tried running MemTest but everything shows okay.

    Read the article

  • Is it possible to group chrome extension processes?

    - by Shajirr
    I have a problem with Chrome - most extensions, even those which consume merely 5-10 MB of memory, each get their own process, while Chrome uses a single process for all the tabs, which consume a lot more memory than the extensions, even with the --process-per-tab switch. This behavior seems illogical - why do you need extensions in separate processes if you can't use your browser properly, when it takes 5-10 seconds just to load a tab and it freezes constantly? Is it possible somehow to limit the number of processes used for extensions, maybe by grouping them, say 10 extensions per process?

    Read the article

  • Extensive HDD use after a VMware Player virtual machine is closed

    - by Bobrovsky
    Each time I close a virtual machine in VMware Player, I see extensive use of the HDD in my system. Basically, the whole system becomes unresponsive for about 5-7 minutes. The host system is Windows 7 Ultimate x64 SP1 with 6 GB of memory and an i3-M350 processor. The virtual machine is Windows XP SP3 x86 (2 GB of memory allocated to the VM). What can be the cause, and what can I do to solve the issue? UPDATE: I am not shutting down the VM; I just close the Player window and the VM saves its state. The system becomes unresponsive right after the VM has saved its state (as indicated by Player) and Player itself has closed.

    Read the article

  • How to choose a NoSQL database engine?

    - by Poma
    We have a database with the following specs:
    - 30k records, 7 MB in size
    - 20 inserts/second
    - 1000 updates/second
    - 1000 range selects/second, by secondary index, approx 10 rows each
    - needs at least one secondary index
    - needs some mechanism to expire keys if they are not updated for 75 seconds (can be done via a programmatic garbage collector, but that requires an additional 'last_update' index and adds some load); a sketch of this pattern follows below
    - consistency is not required
    - durability is not required
    - the DB should be stored in memory
    For now we use Redis, but it has no secondary indexes, and scanning its keys with index:foo:* is too slow. Membase also has no secondary indexes (as far as I know). MongoDB and the MySQL MEMORY engine have table-level locks. What engine will fit our use case?
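
    A programmatic expiry mechanism like the one described above can be sketched independently of any engine. Below is a minimal C# illustration of the 'last_update index plus garbage collector' idea the question mentions; the class and member names are hypothetical, and this is a pattern sketch, not code for any of the databases listed:

        using System;
        using System.Collections.Generic;
        using System.Linq;

        // Minimal sketch of expiring entries not updated for a given interval.
        // A secondary index ordered by last-update time makes the sweep cheap:
        // only the oldest entries are examined, never the whole table.
        class ExpiringStore
        {
            private readonly Dictionary<string, string> _data = new Dictionary<string, string>();
            private readonly SortedDictionary<DateTime, HashSet<string>> _byLastUpdate =
                new SortedDictionary<DateTime, HashSet<string>>();
            private readonly Dictionary<string, DateTime> _lastUpdate = new Dictionary<string, DateTime>();

            public void Put(string key, string value)
            {
                DateTime old;
                if (_lastUpdate.TryGetValue(key, out old))   // remove from the old index slot
                    _byLastUpdate[old].Remove(key);

                DateTime now = DateTime.UtcNow;
                _data[key] = value;
                _lastUpdate[key] = now;
                HashSet<string> bucket;
                if (!_byLastUpdate.TryGetValue(now, out bucket))
                    _byLastUpdate[now] = bucket = new HashSet<string>();
                bucket.Add(key);
            }

            // Called periodically - the "programmatic garbage collector" from the question,
            // e.g. store.ExpireOlderThan(TimeSpan.FromSeconds(75)).
            public void ExpireOlderThan(TimeSpan age)
            {
                DateTime cutoff = DateTime.UtcNow - age;
                foreach (var slot in _byLastUpdate.TakeWhile(s => s.Key < cutoff).ToList())
                {
                    foreach (string key in slot.Value)
                    {
                        _data.Remove(key);
                        _lastUpdate.Remove(key);
                    }
                    _byLastUpdate.Remove(slot.Key);
                }
            }
        }

    In Redis itself, the usual workaround for this requirement is a sorted set: ZADD the key with its update timestamp as the score, then periodically ZRANGEBYSCORE for scores older than 75 seconds and delete those keys - the sorted set plays the same role as _byLastUpdate in the sketch.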

    Read the article

  • CPU Configuration Issue for 2 Servers (Server 2008 R2)

    - by Bill Moreland
    I have 2 servers running the exact same Classic ASP code with Access DBs (yes, not ideal, but it is what it is, for now). 1) Xeon 5520 @ 2.27 GHz (6 GB memory) 2) Xeon E5-2620 @ 2.00 GHz (2 processors, 32 GB memory). For most pages the newer E5-2620 processes the pages 10-15% faster. On pages requiring heavy and/or multiple complicated Access stored procedures (queries), the older 5520 does a much better job. I believe the servers are configured nearly identically. My question: is it possible that the newer, multi-processor server is not as good at handling Classic ASP as the older single-processor one? Is there a configuration difference that needs to be in place that I'm missing, since I'm shooting for identical implementations?

    Read the article

  • Can I rent exclusive time on a powerful server running Linux? [closed]

    - by Mark Borgerding
    My company is involved in a proposal that requires speed estimates of our software on a server with the latest & greatest processors. This is not the first time we've been in this situation. The servers themselves are too expensive to buy a new one every time, so we end up extrapolating from what we have. There are so many variables: processor generation & speed, memory speed, memory channels, cache configurations; it makes extrapolation difficult and error-prone. Is there a business that rents time on the newest servers? At least part of the time we'd need exclusive access to an otherwise quiescent system either via ssh shell access or unattended batch jobs. I am not looking for general cloud computing services. I don't need much time on the server, but it needs to be exclusive. And the server needs to be pretty cutting edge for a solid basis of estimate.

    Read the article

  • What factors can affect the performance of an HTTP server written in C#? [on hold]

    - by Yousaf
    I am having trouble handling huge databases. I have many clients, around 100-300 (the clients are basically servers running e.g. Windows and SQL Server). Each client may have 38 thousand rows/listings of data, and each row has 10-12 fields. I cannot afford to keep JSON files for each client and handle them on the main server, because of memory issues. What if I have an HTTP server written in C or C# installed on the clients, and they return 250 rows in each response to the main server? How can factors like speed, memory or other issues affect us? In short: if a server written in C# sends 250 rows per request, what factors can affect the performance of that server - for example speed, processing, the operating system, the implementation of the server's algorithm? How do these factors really affect performance at large scale?
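
    For a baseline when reasoning about these factors, a server of the kind described can be prototyped with the built-in HttpListener class. This is a minimal sketch (the port, path and row contents are made up for illustration):

        using System;
        using System.Net;
        using System.Text;

        class MinimalRowServer
        {
            static void Main()
            {
                // HttpListener uses the OS HTTP stack (http.sys on Windows),
                // so raw socket handling is not a variable in the measurement.
                var listener = new HttpListener();
                listener.Prefixes.Add("http://localhost:8080/rows/");
                listener.Start();
                Console.WriteLine("Listening on http://localhost:8080/rows/ ...");

                while (true)
                {
                    HttpListenerContext context = listener.GetContext(); // blocks per request
                    var sb = new StringBuilder();
                    for (int i = 0; i < 250; i++)                        // 250 rows per response
                        sb.AppendLine("row " + i + ": field1, field2, field3");

                    byte[] body = Encoding.UTF8.GetBytes(sb.ToString());
                    context.Response.ContentLength64 = body.Length;
                    context.Response.OutputStream.Write(body, 0, body.Length);
                    context.Response.OutputStream.Close();
                }
            }
        }

    Note that this sketch handles one request at a time; whether requests are served synchronously or concurrently (threads, async I/O) is itself one of the biggest factors asked about here, alongside serialization format and network round-trips.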

    Read the article

  • In a virtual machine monitor such as VMware’s ESXi Server, how are shadow page tables implemented?

    - by ali01
    My understanding is that VMMs such as VMware's ESXi Server maintain shadow page tables to map virtual page addresses of guest operating systems directly to machine (hardware) addresses. I've been told that shadow page tables are then used directly by the processor's paging hardware to allow memory access in the VM to execute without translation overhead. I would like to understand a bit more about how the shadow page table mechanism works in a VMM. Is my high level understanding above correct? What kind of data structures are used in the implementation of shadow page tables? What is the flow of control from the guest operating system all the way to the hardware? How are memory access translations made for a guest operating system before its shadow page table is populated? How is page sharing supported? Short of straight up reading the source code of an open source VMM, what resources can I look into to learn more about hardware virtualization?
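
    Not an answer, but as a rough mental model: a shadow page table caches the composition of two mappings that the VMM keeps consistent - the guest's own page table (guest-virtual to guest-physical) and the VMM's map of guest-physical to machine pages. A toy C# sketch of that composition, with every name hypothetical and none of it taken from a real VMM:

        using System;
        using System.Collections.Generic;

        // Toy model: a shadow page table is the cached composition of
        //   guest virtual page  -> guest physical page  (maintained by the guest OS)
        //   guest physical page -> machine page         (maintained by the VMM)
        // The hardware MMU walks only the shadow table, so hits need no translation.
        class ShadowPageTableModel
        {
            private readonly Dictionary<long, long> _guestPageTable = new Dictionary<long, long>();
            private readonly Dictionary<long, long> _guestPhysToMachine = new Dictionary<long, long>();
            private readonly Dictionary<long, long> _shadow = new Dictionary<long, long>();

            // On a "shadow page fault" the VMM fills in the missing entry lazily.
            public long Translate(long guestVirtualPage)
            {
                long machinePage;
                if (_shadow.TryGetValue(guestVirtualPage, out machinePage))
                    return machinePage;                           // fast path: hardware-walked in a real VMM

                long guestPhys = _guestPageTable[guestVirtualPage];   // walk the guest's table
                machinePage = _guestPhysToMachine[guestPhys];         // apply the VMM's mapping
                _shadow[guestVirtualPage] = machinePage;              // populate the shadow entry
                return machinePage;
            }

            // When the guest edits its page table, the VMM must invalidate the
            // corresponding shadow entries (real VMMs trap these writes).
            public void InvalidateGuestMapping(long guestVirtualPage)
            {
                _shadow.Remove(guestVirtualPage);
            }
        }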

    Read the article

  • (How) does deleting open files on Linux and a FAT file system work?

    - by lxgr
    It's clear to me how deleting open files works on filesystems that use inodes - unlink() just decreases the link count to zero, and when the last file handle to the file is closed, the inode will be removed. But how does it work when using a file system that doesn't use inodes, like FAT32, with Linux? Some experiments suggest that deleting open files is still possible (unlike on Windows, where the unlink call wouldn't succeed), but what happens when the file system is uncleanly unmounted? How does Linux mark the files as unlinked, when the file system itself doesn't support such an operation? Is the directory entry just deleted, but retained in memory (that would guarantee deletion after unmounting in any case, but would leave the file system in an inconsistent state), or will the deletion only be marked in memory, and written at the time the last file handle is closed, avoiding possible corruption, but restoring the deleted files after an unclean unmount?
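
    The inode behaviour described in the first sentence is easy to demonstrate. Here is a small C# sketch, assuming it runs on Linux (e.g. under Mono or .NET Core), where File.Delete maps to unlink(); run against a FAT32 mount, it exercises exactly the corner case being asked about:

        using System;
        using System.IO;
        using System.Text;

        class DeleteOpenFileDemo
        {
            static void Main()
            {
                string path = "/tmp/demo.txt";
                File.WriteAllText(path, "still readable after unlink");

                using (FileStream fs = File.OpenRead(path))
                {
                    File.Delete(path); // on Linux this is unlink(): the directory entry
                                       // goes away, but the inode lives while fs is open
                    Console.WriteLine("Exists after delete: " + File.Exists(path)); // False

                    byte[] buffer = new byte[64];
                    int n = fs.Read(buffer, 0, buffer.Length);
                    Console.WriteLine(Encoding.UTF8.GetString(buffer, 0, n)); // data still there
                }
                // Once fs is closed, the last reference drops and the blocks are freed.
                // On Windows the File.Delete call would fail instead, as the question notes.
            }
        }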

    Read the article

  • intermittent SSH with ssh_exchange_identification error

    - by rafamvc
    My SSH connection to my server works for around 10 minutes out of every 30. Things I have figured out that might be relevant: The server is under load (it is a database server), but in those spare moments when I can connect it is still under the same load, which doesn't make sense. The server runs Ubuntu, and consolekit was using a lot of virtual memory; I restarted consolekit and it seems to be using a sane amount of memory now. It is not hosts.allow or hosts.deny - those are set up properly. It is not a firewall problem - those settings were working before, and the same settings are working for other similar machines. It is on EC2, the Amazon cloud.

    Read the article

  • ATI HD 5970 display

    - by user55406
    Hello everyone, I'm a little worried about my graphics card. Before I get into that, my system specs: an Intel DX58SO mainboard, 6 GB Corsair XMS DDR3 RAM, an Intel i7 960 CPU, and an ATI HD 5970, in a Cooler Master HAF 92 case. My OS is Vista x64. Here is my issue: when I type dxdiag into Start Search and look under Display, I see 716 MB of graphics memory as the "approx. total memory". The ATI HD 5970 is a 2 GB graphics card. Am I being stupid, or is there an issue?

    Read the article

  • SQLAuthority News – A Successful Performance Tuning Seminar at Pune – Dec 4-5, 2010

    - by pinaldave
    This is a report on my third very successful seminar event on SQL Server Performance Tuning. The SQL Server Performance Tuning Seminar in Colombo was oversubscribed with a total of 35 attendees; you can read the details over here: SQLAuthority News – SQL Server Performance Optimizations Seminar – Grand Success – Colombo, Sri Lanka – Oct 4-5, 2010. The SQL Server Performance Tuning Seminar in Hyderabad was oversubscribed with a total of 25 attendees; you can read the details over here: SQL SERVER – A Successful Performance Tuning Seminar – Hyderabad – Nov 27-28, 2010. The same seminar was offered in Pune on December 4-5, 2010, and we had another successful event with lots of performance talk, attended by 30 people. The best part of the seminar was that, along with our agenda, we talked about the following very interesting concepts:
    - Deadlock Detection and Removal
    - Dynamic SQL and Inline Code
    - SQL Optimizations
    - Multiple OR Conditions and Performance Tuning
    - Dynamic Search Condition Building and Improvement
    - Memory Cache and Improvement
    - Bottleneck Detection – Memory, CPU and IO
    - Beginning Performance Tuning on Production
    - Parametrization
    - Improving Already Super-Fast Queries
    - Convenience vs. Performance
    - The Proper Way to Create Indexes
    - Hints and Disadvantages
    I had a great time doing the seminar and sharing my performance tricks with all. The highlight was explaining to the attendees how I begin performance tuning when I go on performance tuning consultations. [Photos: Pinal Dave at the SQL Server Performance Tuning Seminar] This seminar series is 100% demo-oriented, with none of the usual PowerPoint talk; it was created from my experiences doing performance tuning for various organizations. I am not planning any more seminars this year - it was great, but I am currently booked for the next 60 days at various performance tuning engagements. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Optimization, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, SQL Training, SQLAuthority News, T SQL, Technology

    Read the article

  • How do I install a driver for an Atheros AR9285?

    - by Fernando
    How do I install the driver for an Atheros AR9285 in Ubuntu 11.10? There is still no package for 11.10 according to https://help.ubuntu.com/community/WifiDocs/Device/Atheros/AR9285. Here is the output of the commands:

        marc@fer-VPCYA1V9E:~$ sudo lshw -class network
        *-network DISABLED
             description: Wireless interface
             product: AR9285 Wireless Network Adapter (PCI-Express)
             vendor: Atheros Communications Inc.
             physical id: 0
             bus info: pci@0000:02:00.0
             logical name: wlan0
             version: 01
             serial: 4c:0f:6e:d6:65:cc
             width: 64 bits
             clock: 33MHz
             capabilities: pm msi pciexpress bus_master cap_list ethernet physical wireless
             configuration: broadcast=yes driver=ath9k driverversion=3.0.0-12-generic firmware=N/A latency=0 link=no multicast=yes wireless=IEEE 802.11bgn
             resources: irq:16 memory:d3400000-d340ffff
        *-network
             description: Ethernet interface
             product: AR8131 Gigabit Ethernet
             vendor: Atheros Communications
             physical id: 0
             bus info: pci@0000:03:00.0
             logical name: eth0
             version: c0
             serial: 54:42:49:a2:1f:bc
             capacity: 1Gbit/s
             width: 64 bits
             clock: 33MHz
             capabilities: pm msi pciexpress vpd bus_master cap_list ethernet physical tp 10bt 10bt-fd 100bt 100bt-fd 1000bt-fd autonegotiation
             configuration: autonegotiation=on broadcast=yes driver=atl1c driverversion=1.0.1.0-NAPI firmware=N/A latency=0 link=no multicast=yes port=twisted pair
             resources: irq:43 memory:d2400000-d243ffff ioport:1000(size=128)

    And the second command:

        marc@fer-VPCYA1V9E:~$ rfkill list
        0: sony-wifi: Wireless LAN
                Soft blocked: no
                Hard blocked: no
        1: sony-bluetooth: Bluetooth
                Soft blocked: no
                Hard blocked: no
        2: phy0: Wireless LAN
                Soft blocked: no
                Hard blocked: no
        3: hci0: Bluetooth
                Soft blocked: no
                Hard blocked: no
        4: acer-wireless: Wireless LAN
                Soft blocked: yes
                Hard blocked: no

    Is there a way to make it work?

    Read the article

  • SQL SERVER – Configure Management Data Collection in Quick Steps – T-SQL Tuesday #005

    - by pinaldave
    This article was written as a response to T-SQL Tuesday #005 – Reporting. The three most important components of any computer or server are the CPU, memory, and hard disk. This post talks about how to get more details on these three components using Management Data Collection, which generates reports for them by default. Configuring Data Collection is a very easy task and can be done very quickly. Please note: there are many different ways to get reports for CPU, memory and IO - you can use DMVs, Extended Events, as well as Perfmon to trace the data. In keeping with the T-SQL Tuesday subject of reporting, this post is a visual tutorial on quickly configuring Data Collection and generating reports. From Books Online: "The data collector is a core component of the Data Collection platform for SQL Server 2008 and the tools that are provided by SQL Server. The data collector provides one central point for data collection across your database servers and applications. This collection point can obtain data from a variety of sources and is not limited to performance data, unlike SQL Trace." Let us go over the visual tutorial on how quickly Data Collection can be configured: expand the Management node under the main server node and follow the directions in the pictures. The reports can be exported to PDF as well as Excel by right-clicking on the report. Now let us see some additional screenshots of the reports. The reports are very self-explanatory, but can be drilled into for further details. Click on an image to make it larger. Well, as we can see, it is very easy to configure and utilize this tool. Do you use this tool in your organization? Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: SQL Reporting, SQL Reports

    Read the article

  • Downloading a file over HTTP the SSIS way

    This post shows you how to download files from a web site whilst really making the most of the SSIS objects that are available. There is no task to do this, so we have to use the Script Task and some simple VB.NET or C# (if you have SQL Server 2008) code. Very often I see suggestions about how to use the .NET class System.Net.WebClient, and of course this works; you can code pretty much anything you like in .NET. Here I'd just like to raise the profile of an alternative. This approach uses the HTTP Connection Manager, one of the stock connection managers, so you can use configurations and property expressions in the same way you would for all other connections. Settings like the security details that you would want to make configurable already are, but if you take the .NET route you have to write quite a lot of code to manage those values via package variables. Using the connection manager we get all of that flexibility for free. The screenshot below illustrates some of the options we have. Using the HttpClientConnection class makes for much simpler code as well. I have demonstrated two methods: DownloadFile, which just downloads a file to disk, and DownloadData, which downloads the file and retains it in memory. In each case we show a message box to note the completion of the download. You can download a sample package below, but first the code:

        Imports System
        Imports System.IO
        Imports System.Text
        Imports System.Windows.Forms
        Imports Microsoft.SqlServer.Dts.Runtime

        Public Class ScriptMain

            Public Sub Main()
                ' Get the unmanaged connection object, from the connection manager called "HTTP Connection Manager"
                Dim nativeObject As Object = Dts.Connections("HTTP Connection Manager").AcquireConnection(Nothing)

                ' Create a new HTTP client connection
                Dim connection As New HttpClientConnection(nativeObject)

                ' Download the file #1
                ' Save the file from the connection manager to the local path specified
                Dim filename As String = "C:\Temp\Sample.txt"
                connection.DownloadFile(filename, True)

                ' Confirm file is there
                If File.Exists(filename) Then
                    MessageBox.Show(String.Format("File {0} has been downloaded.", filename))
                End If

                ' Download the file #2
                ' Read the text file straight into memory
                Dim buffer As Byte() = connection.DownloadData()
                Dim data As String = Encoding.ASCII.GetString(buffer)

                ' Display the file contents
                MessageBox.Show(data)

                Dts.TaskResult = Dts.Results.Success
            End Sub

        End Class

    Sample Package: HTTPDownload.dtsx (74KB)
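
    Since the post mentions C# as an option on SQL Server 2008, a C# equivalent of the same Script Task body might look like this - a sketch assuming the same connection manager name and target path as the VB.NET version above, where ScriptResults is the enum the SSIS C# script template generates:

        using System;
        using System.IO;
        using System.Text;
        using System.Windows.Forms;
        using Microsoft.SqlServer.Dts.Runtime;

        public void Main()
        {
            // Get the unmanaged connection object from the "HTTP Connection Manager"
            object nativeObject = Dts.Connections["HTTP Connection Manager"].AcquireConnection(null);

            // Create a new HTTP client connection
            HttpClientConnection connection = new HttpClientConnection(nativeObject);

            // Download #1: save the file straight to disk
            string filename = @"C:\Temp\Sample.txt";
            connection.DownloadFile(filename, true);
            if (File.Exists(filename))
                MessageBox.Show(string.Format("File {0} has been downloaded.", filename));

            // Download #2: read the file into memory
            byte[] buffer = connection.DownloadData();
            string data = Encoding.ASCII.GetString(buffer);
            MessageBox.Show(data);

            Dts.TaskResult = (int)ScriptResults.Success;
        }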

    Read the article

  • Oracle Announces Oracle Big Data Appliance X3-2 and Enhanced Oracle Big Data Connectors

    - by jgelhaus
    Enables customers to easily harness the business value of big data at lower cost. The engineered system simplifies big data for the enterprise. Oracle Big Data Appliance X3-2 hardware features the latest 8-core Intel® Xeon E5-2600 series of processors; compared with the previous generation, the 18 compute and storage servers with 648 TB raw storage now offer:
    - 33 percent more processing power, with 288 CPU cores
    - 33 percent more memory per node, with 1.1 TB of main memory
    - up to a 30 percent reduction in power and cooling
    Oracle Big Data Appliance X3-2 further simplifies implementation and management of big data by integrating all the hardware and software required to acquire, organize and analyze big data. It includes:
    - Support for CDH4.1, including software upgrades developed collaboratively with Cloudera to simplify NameNode High Availability in Hadoop, eliminating the single point of failure in a Hadoop cluster
    - Oracle NoSQL Database Community Edition 2.0, the latest version, which brings better Hadoop integration, elastic scaling and new APIs, including JSON and C support
    - The Oracle Enterprise Manager plug-in for Big Data Appliance, which complements Cloudera Manager to enable users to more easily manage a Hadoop cluster
    - Updated distributions of Oracle Linux and the Oracle Java Development Kit
    - An updated distribution of open source R, optimized to work with high-performance multi-threaded math libraries
    Connectors integrate Hadoop with the Oracle big data ecosystem. Oracle Big Data Connectors is a suite of software built by Oracle to integrate Apache Hadoop with Oracle Database, Oracle Data Integrator, and Oracle R Distribution. Enhancements to Oracle Big Data Connectors extend these data integration capabilities. With updates to every connector, this release includes:
    - Oracle SQL Connector for Hadoop Distributed File System, for high-performance SQL queries on Hadoop data from Oracle Database, enhanced with increased automation and querying of Hive tables, and now supported within the Oracle Data Integrator Application Adapter for Hadoop
    - Transparent access to the Hive Query Language from R and new analytic techniques executing natively in Hadoop, enabling R developers to be more productive by increasing access to Hadoop in the R environment
    Further reading: Data sheet: Oracle Big Data Appliance X3-2; Oracle Big Data Appliance: Datacenter Network Integration; Big Data and Natural Language: Extracting Insight From Text; Thomson Reuters Discusses Oracle's Big Data Platform; Data sheet: Oracle Big Data Connectors; High Performance Connectors for Load and Access of Data from Hadoop to Oracle Database

    Read the article

  • Web application development over C++ development

    - by learnerforever
    Hi, I am a CS undergrad and CS grad. In college I used to program in C/C++/Java, and I have pretty much stuck to the same skill set in industry, with 3 years of experience. I like thinking, reading, applying logic, and designing data structures, but I have little patience for debugging large C++ codebases and for dealing with low-level issues like memory faults, memory corruption, and compilation/linking problems. My confidence in programming is dropping because of this, but I like being in a technical field. Does web application development with the LAMP suite (Linux, Apache, MySQL, PHP), CSS, scripting (among other web-development-related skills) need less patience with debugging and less understanding of low-level details, while still using your analytical and logical skills? Opportunities in web application development also look more plentiful. Things like scalability, and most of the stuff that Google does, fascinate me - if not for the patience needed for C++ debugging. I make blunders while coding. How does the field look outside C++? I am beginning to wonder whether, as a female, I could better manage work-life balance by moving to web application development; I have seen relatively fewer females in C++ than in Java/.NET, though I am not very sure about web-related work. Also, what are the other hot technologies being used in web application development? LAMP and CSS are things I know only vaguely; I am not in touch with the keywords going around in this area. Please help!

    Read the article

  • Book Review &ndash; Developer&rsquo;s Guide To Collections in Microsoft&reg; .NET

    - by Lori Lalonde
    Developer’s Guide To Collections in Microsoft® .NET, by Calvin Janes, discusses the various collections available in the built-in .NET libraries, as well as the advantages and disadvantages of using each type of collection. Other areas are also covered, including how collections utilize memory, how to use LINQ with collections, using threading with collections, serializing collections, and how to bind collections to controls in Windows Forms, WPF and Silverlight. For developers looking for a simple reference book on collections, this book will serve that purpose and serve it well. Those looking for a great read from cover to cover may be disappointed: the book tends to be repetitive in discussion topics, examples, and code samples in its first two parts. In the first part, the author conducts walk-throughs to develop custom collections; in the second part, walk-throughs of using the built-in .NET collections. For experienced .NET developers, the first two parts will not provide much value. However, they are beneficial for new developers who have not worked with the built-in collections in .NET, who will gain an understanding of the mechanics of the built-in collections and how memory is utilized by the various types of collections - so in this respect, new developers will get more value out of this book. The third and fourth parts delve into advanced topics, including LINQ, threading, serialization and data binding. I find these two parts well written, and they flow better than the first two. Both beginner and experienced developers will find value in this half of the book, mainly in the topics of threading and serialization. The eBook format of this book was provided free through O'Reilly's Blogger Review program. This book can be purchased from the O'Reilly book store at: http://shop.oreilly.com/product/0790145317193.do

    Read the article

  • Realtek Semiconductor Co., Ltd. RTL8101E/RTL8102E

    - by Sebastian Bugiu
    I had Ubuntu 11.10, and in the last few weeks I experienced an obscure problem: after the computer had been running for a few days, I could no longer connect to google.com or anything related to Google. All sites worked with all browsers (Firefox, Chrome, Opera) except Google; it remained in the connecting phase for a few minutes and either timed out or finally connected with this huge delay. Even on other sites, such as this one, if the page had anything to do with Google - AdSense, gstatic, or whatever with a G in it - it took a long time to load, waiting on connecting to gstatic.com. Anything Google-related took minutes to work, but everything else worked instantly! I tried rebooting, and using another machine (with Windows on it), and this worked, so it's not network-related. But after a few days it started misbehaving again... So I upgraded to Precise Pangolin hoping this behavior would go away. It didn't! After a few days I get the same behavior as in 11.10. What am I supposed to do? Reboot every other day? I didn't have this problem with either 10.10 or 11.04. I found the Realtek RTL8168/8111E issue with the r8169 driver, but this is not exactly the same card, so probably trying r8168 won't help.

        Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8101E/RTL8102E PCI Express Fast Ethernet controller (rev 02)
            Subsystem: Toshiba America Info Systems Device ff1c
            Flags: bus master, fast devsel, latency 0, IRQ 44
            I/O ports at 4000 [size=256]
            Memory at d0010000 (64-bit, prefetchable) [size=4K]
            Memory at d0000000 (64-bit, prefetchable) [size=64K]
            Capabilities: [40] Power Management version 7
            Capabilities: [50] MSI: Enable+ Count=1/1 Maskable- 64bit+
            Capabilities: [70] Express Endpoint, MSI 01
            Capabilities: [ac] MSI-X: Enable- Count=2 Masked-
            Capabilities: [cc] Vital Product Data
            Capabilities: [100] Advanced Error Reporting
            Capabilities: [140] Virtual Channel
            Capabilities: [160] Device Serial Number 09-00-00-00-ff-ff-00-00
            Kernel driver in use: r8169
            Kernel modules: r8169

    Read the article

  • Extending ASP.NET Output Caching

    One of the most sure-fire ways to improve a web application's performance is to employ caching. Caching takes some expensive operation and stores its results in a quickly accessible location. Since its inception, ASP.NET has offered two flavors of caching:
    - Output Caching - caches the entire rendered markup of an ASP.NET page or User Control for a specified duration.
    - Data Caching - an API for caching objects. Using the data cache you can write code to add, remove, and retrieve items from the cache.
    Until recently, the underlying functionality of these two caching mechanisms was fixed - both cached data in the web server's memory. This has its drawbacks. In some cases, developers may want to save output cache content to disk. When using the data cache you may want to cache items to the cloud or to a distributed caching architecture like memcached. The good news is that with ASP.NET 4 and the .NET Framework 4, the output caching and data caching options are now much more extensible. Both caching features are now based upon the provider model, meaning that you can create your own output cache and data cache providers (or download and use a third-party or open source provider) and plug them into a new or existing ASP.NET 4 application. This article focuses on extending the output caching feature. We'll walk through how to create a custom output cache provider that caches a page or User Control's rendered output to disk (as opposed to memory) and then see how to plug the provider into an ASP.NET application. A complete working example, available in both VB and C#, is available for download at the end of this article. Read on to learn more! Read More >
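
    To give a feel for the provider model described here, the following is a minimal C# sketch of a disk-backed output cache provider. It is not the article's downloadable sample; the cache directory and file-naming scheme are assumptions:

        using System;
        using System.IO;
        using System.Runtime.Serialization.Formatters.Binary;
        using System.Web.Caching;

        // Minimal disk-backed provider: derives from the ASP.NET 4
        // OutputCacheProvider base class and serializes entries to files.
        public class DiskOutputCacheProvider : OutputCacheProvider
        {
            private readonly string _cachePath = @"C:\Temp\OutputCache"; // assumed location

            private string FileFor(string key)
            {
                // Cache keys contain path characters, so encode them for the file name.
                string safe = Convert.ToBase64String(System.Text.Encoding.UTF8.GetBytes(key))
                                     .Replace('/', '_');
                return Path.Combine(_cachePath, safe + ".bin");
            }

            public override object Get(string key)
            {
                string file = FileFor(key);
                if (!File.Exists(file))
                    return null;
                using (var stream = File.OpenRead(file))
                {
                    var item = (CacheItem)new BinaryFormatter().Deserialize(stream);
                    if (item.UtcExpiry < DateTime.UtcNow) { Remove(key); return null; }
                    return item.Value;
                }
            }

            public override object Add(string key, object entry, DateTime utcExpiry)
            {
                // Per the base-class contract: only store if the key is not already cached.
                object existing = Get(key);
                if (existing != null)
                    return existing;
                Set(key, entry, utcExpiry);
                return entry;
            }

            public override void Set(string key, object entry, DateTime utcExpiry)
            {
                Directory.CreateDirectory(_cachePath);
                using (var stream = File.Create(FileFor(key)))
                    new BinaryFormatter().Serialize(stream,
                        new CacheItem { Value = entry, UtcExpiry = utcExpiry });
            }

            public override void Remove(string key)
            {
                string file = FileFor(key);
                if (File.Exists(file))
                    File.Delete(file);
            }

            [Serializable]
            private class CacheItem
            {
                public object Value;
                public DateTime UtcExpiry;
            }
        }

    A provider like this would then be registered in web.config under the outputCache element's providers section and selected via its defaultProvider attribute, as the full article walks through.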

    Read the article

  • Is recursion really bad?

    - by dotneteer
    After my previous post about stack space, it appears from the feedback that there is a perception that recursion is bad and we should avoid deep recursion. After writing a compiler, I know that modern computers and compilers are complex enough that one cannot automatically assume hand-crafted code would out-perform the compiler's optimization. The only way to know is to prototype and find out. So why might recursive code not perform as well? Compilers place frames on a stack. In addition to arguments and local variables, compilers also need to place frame and program pointers on the frame, resulting in overhead. And why might hand-crafted code not perform as well either? The stack used by a compiler is a simple data structure that can grow and shrink cleanly. To replace recursion with our own stack, our stack must be allocated on the heap, which is far more complicated to manage. There can be overhead as well if the compiler needs to mark objects for garbage collection, and the compiler also needs to worry about memory fragmentation. Then there is additional complexity: CPUs have registers and multiple levels of cache. Register access is a few times faster than in-CPU cache access, which in turn is a few tens of times faster than on-board memory access, so it is up to the OS and compiler to maximize the use of registers and in-CPU cache. For my particular problem, I did an experiment rewriting my C# recursive code with a loop-and-stack approach. Here are the outcomes of the two approaches:

                                         Recursive call   Loop and stack
        Lines of code for the algorithm  17               46
        Speed                            Baseline         3% faster
        Readability                      Clean            Far more complex

    So in the end, I was able to achieve 3% better performance, with other drawbacks. My message: never assume your sophisticated approach will automatically work out better than a simpler approach on a modern computer with a modern compiler. Gauge carefully before committing to the more complex approach.
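
    The author's code is not shown, but the trade-off can be made concrete with a toy problem. A sketch of both approaches for summing a binary tree, in C# with hypothetical type names:

        using System;
        using System.Collections.Generic;

        class Node
        {
            public int Value;
            public Node Left, Right;
        }

        static class TreeSum
        {
            // Recursive version: short and clear, but each call pushes a
            // compiler-managed frame (arguments, locals, return address).
            public static long SumRecursive(Node node)
            {
                if (node == null) return 0;
                return node.Value + SumRecursive(node.Left) + SumRecursive(node.Right);
            }

            // Loop-and-stack version: no call overhead, but the explicit
            // Stack<T> lives on the heap and the code is harder to follow.
            public static long SumWithStack(Node root)
            {
                long sum = 0;
                var stack = new Stack<Node>();
                if (root != null) stack.Push(root);
                while (stack.Count > 0)
                {
                    Node node = stack.Pop();
                    sum += node.Value;
                    if (node.Left != null) stack.Push(node.Left);
                    if (node.Right != null) stack.Push(node.Right);
                }
                return sum;
            }
        }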

    Read the article
