Search Results

Search found 407 results on 17 pages for 'jackson davis'.

Page 5/17 | < Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • Installing Ubuntu Server 11.04

    - by Jackson Walters
    Trying to install Ubuntu Server 11.04 on a 64-bit AMD machine. The install goes fine until after "Loading additional components", then I get a blank purple screen with a responsive cursor at the bottom. I've looked everywhere and can't find anything (all the documentation seems to be on the desktop version). I have a wireless card installed if that makes a difference, though I don't know why it would. This happens regardless of whether nomodeset is set. Here's what it looks like (as described): Specs: CPU - 64-bit AMD Sempron 140 @ 2.7GHz RAM - 1GB DDR3 1333 HDD - WD 500GB 7200RPM SATA 3.0Gb/s Mobo - Foxconn M61PMP-K Wireless Card - Rosewill RNX-G300LX

    Read the article

  • On the art of self-promotion

    - by Tony Davis
    I attended Brent Ozar's Building the Fastest SQL Servers session at Tech Ed last week, and found myself engulfed in a 'perfect storm' of excellent technical and presentational skills coupled with an astute awareness of the value of promoting one's work. I spend a lot of time at such events talking to developers and DBAs about the value of blogging and writing articles, and my impression is that some could benefit from a touch less modesty and a little more self-promotion. I sense a reticence in many would-be writers. Is what I have to say important enough? Haven't far more qualified and established commentators, MVPs and so on, already said it? While it's a good idea to pick reasonably fresh and interesting topics, it's more important not to let such fears lead to writer's block. In the eyes of any future employer, your published writing is an extension of your resume. They will not care that a certain MVP knows how to solve problem x, but they will be very interested to see that you have tackled that same problem, and solved it in your own way, and described the process in your own voice. In your current job, your writing is one of the ways you can express to your peers, and to the organization as a whole, the value of what you contribute. Many Developers and DBAs seem to rely on the idea that their work will speak for itself, and that their skill shines out from it. Unfortunately, this isn't always true. Many Development DBAs, for example, will be painfully aware of the massive effort involved in tuning and adding resilience to rapidly developed applications. However, others in the organization who are unaware of what's involved in getting an application that is 'done' ready for production may dismiss such efforts as fussiness or conservatism. At the dark end of the development cycle, chickens come home to roost, but their droppings tend to land on those trying to clear up the mess. My advice is this: next time you fix a bug or improve the resilience or performance of a database or application, make sure that you use team meetings, informal discussions and so on to ensure that people understand what the problem was and what you had to do to fix it. Use your blog to describe, generally, the process you adopted, the resources you used and the insights that came from your work. Encourage your colleagues to do the same. By spreading the art of self-promotion to everyone involved in an IT project, we get a better idea of the extent of the work and the value of the contribution of all the team members. As always, we'd love to hear what you think. This very week, Simple-talk launches its new blogging platform. If any of this has moved you to 'throw your hat into the ring', drop us a mail at [email protected]. Cheers, Tony.

    Read the article

  • Data Model Dissonance

    - by Tony Davis
    So often at the start of the development of database applications, there is a premature rush to the keyboard. Unless, before we get there, we’ve mapped out and agreed the three data models, the Conceptual, the Logical and the Physical, then the inevitable refactoring will dog development work. It pays to get the data models sorted out up-front, however ‘agile’ you profess to be. The hardest model to get right, the most misunderstood, and the one most neglected by the various modeling tools, is the conceptual data model, and yet it is critical to all that follows. The conceptual model distils what the business understands about itself, and the way it operates. It represents the business rules that govern the required data, its constraints and its properties. The conceptual model uses the terminology of the business and defines the most important entities and their inter-relationships. Don’t assume that the organization’s understanding of these business rules is consistent or accurate. Too often, one department has a subtly different understanding of what an entity means and what it stores, from another. If our conceptual data model fails to resolve such inconsistencies, it will reduce data quality. If we don’t collect and measure the raw data in a consistent way across the whole business, how can we hope to perform meaningful aggregation? The conceptual data model has more to do with business than technology, and as such, developers often regard it as a worthy but rather arcane ceremony like saluting the flag or only eating fish on Friday. However, the consequences of getting it wrong have a direct and painful impact on many aspects of the project. If you adopt a silo-based (a.k.a. domain-driven) approach to development, you are still likely to suffer by starting with an incomplete knowledge of the domain. Even when you have surmounted these problems so that the data entities accurately reflect the business domain that the application represents, there are likely to be dire consequences from abandoning the goal of a shared, enterprise-wide understanding of the business. In reading this, you may recall experiences of the consequence of getting the conceptual data model wrong. I believe that Phil Factor, for example, witnessed the abandonment of a multi-million dollar banking project due to an inadequate conceptual analysis of how the bank defined a ‘customer’. We’d love to hear of any examples you know of development projects poleaxed by errors in the conceptual data model. Cheers, Tony

    Read the article

  • Why won't my Broadcom BCM4312 LP-PHY work with the STA driver?

    - by Jackson Taylor
    I tried the steps here for a 4312: https://help.ubuntu.com/community/WifiDocs/Driver/bcm43xx Both of these: sudo modprobe -r b43 ssb wl sudo modprobe wl return: FATAL: Module wl not found. FATAL: Error running install command for wl (this one is only for the second one actually) I tried the broadcom-sta, didn't work. What's confusing is down below in the next steps for STA with internet access it says to use the bcmwl one. So I install that and it succeeds but with some errors: sudo apt-get install bcmwl-kernel-source Reading package lists... Done Building dependency tree Reading state information... Done The following package was automatically installed and is no longer required: module-assistant Use 'apt-get autoremove' to remove it. The following NEW packages will be installed: bcmwl-kernel-source 0 upgraded, 1 newly installed, 0 to remove and 0 not upgraded. Need to get 0 B/1,181 kB of archives. After this operation, 3,609 kB of additional disk space will be used. Selecting previously unselected package bcmwl-kernel-source. (Reading database ... 168005 files and directories currently installed.) Unpacking bcmwl-kernel-source (from .../bcmwl-kernel-source_5.100.82.112+bdcom-0ubuntu3_amd64.deb) ... Setting up bcmwl-kernel-source (5.100.82.112+bdcom-0ubuntu3) ... Loading new bcmwl-5.100.82.112+bdcom DKMS files... Building only for 3.5.0-21-generic Building for architecture x86_64 Module build for the currently running kernel was skipped since the kernel source for this kernel does not seem to be installed. ERROR: Module b43 does not exist in /proc/modules ERROR: Module b43legacy does not exist in /proc/modules ERROR: Module ssb does not exist in /proc/modules ERROR: Module bcm43xx does not exist in /proc/modules ERROR: Module brcm80211 does not exist in /proc/modules ERROR: Module brcmfmac does not exist in /proc/modules ERROR: Module brcmsmac does not exist in /proc/modules ERROR: Module bcma does not exist in /proc/modules FATAL: Module wl not found. FATAL: Error running install command for wl update-initramfs: deferring update (trigger activated) Processing triggers for initramfs-tools ... update-initramfs: Generating /boot/initrd.img-3.5.0-21-generic jtaylor991@jtaylor991-whiteHP:~$ sudo modprobe wl FATAL: Module wl not found. FATAL: Error running install command for wl Then I do the modprobe wl commands listed above and it gives the above listed errors. It didn't work with the broadcom-sta driver either. I installed the b43 ones but nothing happened, and I don't know why so those are still installed. firmware-b43legacy-installer, b43-fwcutter and firmware-b43-lpphy-installer (yes it is a LP-PHY) are currently installed. If I go into System Settings Software Sources Additional Drivers it says "Using Broadcom 802.11 Linux STA wireless driver source from bcmwl-kernel-source (proprietary) But bcmwl-kernel-source isn't installed. I could try again but I remember rebooting and it still said this. What's funny is it found wireless networks during the Ubuntu setup/installation, I don't remember if I got it to connect or not though. I think it kept asking for a password when I put it in (yes it was right I showed password and looked at it) so I just ignored it. But right now the Enable Wireless option in the top right is just gone, it's just Enable Networking and I'm on ethernet on this HP Pavilion dv4-1435dx right here. If I run rfkill list it shows: 0: hp-wifi: Wireless LAN Soft blocked: no Hard blocked: no It was hard blocked at the beginning but unblocking it makes no change. 
Also, it's a touch-sensitive button, and it appears to always be orange whether it's enabled or not, because when I touch it the hard-blocked state changes between yes and no in rfkill list. I think it was blue for a minute at one point. What is going on?!?! Help me! Lol, thanks for any and all of your time, guys. Oh yeah, this is a fresh install of Ubuntu 12.10.

    Read the article

  • Where have the Direct3D 11 tutorials on MSDN gone?

    - by Cam Jackson
    I've had this tutorial bookmarked for ages. I've just decided to give DX11 a real go, so I've gone through that tutorial, but I can't find where the next one in the series is! There are no links from that page to either the next in the series, or back up to the table of contents that lists all of the tutorials. These are just companion tutorials to the samples that come with the SDK, but I find them very helpful. Searching MSDN from google and the MSDN Bing search box has turned up nothing, it's like they've removed all links to these tutorials, but the pages are still there if you have the URLs. Unfortunately, MSDN URLs are akin to youtube URLs, so I can't just guess the URL of the next tutorial. Anyone have any idea what happened to these tutorials, or how I can find the others?

    Read the article

  • 12.10: no audio via HDMI and video speeds up

    - by jackson
    I have a laptop with an ATI Radeon 4200. On 12.04 everything worked fine, but since upgrading to 12.10 I cannot get sound over HDMI. When I switch to HDMI audio, the video speeds up to about 2x. I can use the speakers in my laptop and watch video via HDMI with no problems. Things I have tried: various tutorials to install the AMD/ATI drivers, all of which resulted in low-graphics mode; checked that everything is properly set in alsamixer and the sound utility; installed pavucontrol and checked everything in there; verified the output from cat /proc/asound/cards looks normal. When I initially upgraded there was a plethora of problems, which I believe were due to the old proprietary driver still being used but not compatible; after a few hours trying to fix that I decided just to back up and do a fresh install, which works great except for the above stated problem. Any help would be greatly appreciated!! Finally, hopefully this hasn't already been answered; I have tried a few different searches on the boards and haven't come up with anything. $ aplay -l **** List of PLAYBACK Hardware Devices **** card 0: SB [HDA ATI SB], device 0: ALC269VB Analog [ALC269VB Analog] Subdevices: 1/1 Subdevice #0: subdevice #0 card 1: HDMI [HDA ATI HDMI], device 3: HDMI 0 [HDMI 0] Subdevices: 0/1 Subdevice #0: subdevice #0

    Read the article

  • Why does 12.04 try but fail to hibernate, even after I enabled hibernation?

    - by Roger Davis
    Regarding the below (-----) info from another post, my system will try (as is said below) to hibernate, but it won't get all the way there. Hard drive activity stops, but it does not shut down. If I turn the power off, then back on, it will start, but I have to "restore previous session" in the browser and other open apps don't restart, with the accompanying hassle. So because it tries, will the suggested fix then cause it to work correctly, or is the literal "try" not really what he means?!? PLEASE NOTE - this is a desktop system, not a laptop. Before enabling hibernation, please try to test whether it works correctly by running pm-hibernate in a terminal. The system will try to hibernate. If you are able to start the system again then you are more or less safe to add an override. To do so, start editing sudo nano /etc/polkit-1/localauthority/50-local.d/com.ubuntu.enable-hibernate.pkla Fill it with this [Re-enable hibernate by default] Identity=unix-user:* Action=org.freedesktop.upower.hibernate ResultActive=yes Save by pressing Ctrl-O and exit nano by pressing Ctrl-X Restart and hibernation is back!

    Read the article

  • Site overthrown by Turkish hackers...

    - by Jackson Gariety
    Go ahead, laugh. I forgot to remove the default admin/admin account on my blog. Somebody got in and has replaced my homepage with some internet graffiti. I've used .htaccess to replace the page with a 403 error, but no matter what I do, my WordPress homepage is this hacker thing. How can I set up my server so that ONLY MYSELF can view it while I'm fixing this via .htaccess? What steps should I take to eradicate them from my server? If I delete the ENTIRE website and change all the passwords, is he completely gone? Thanks.

    Read the article

  • The long road to bug-free software

    - by Tony Davis
    The past decade has seen a burgeoning interest in functional programming languages such as Haskell or, in the Microsoft world, F#. Though still on the periphery of mainstream programming, functional programming concepts are gradually seeping into the imperative C# language (for example, Lambda expressions have their root in functional programming). One of the more interesting concepts from functional programming languages is the use of formal methods, the lofty ideal behind which is bug-free software. The idea is that we write a specification that describes exactly how our function (say) should behave. We then prove that our function conforms to it, and in doing so have proved beyond any doubt that it is free from bugs. All programmers already use one form of specification, specifically their programming language's type system. If a value has a specific type then, in a type-safe language, the compiler guarantees that value cannot be an instance of a different type. Many extensions to existing type systems, such as generics in Java and .NET, extend the range of programs that can be type-checked. Unfortunately, type systems can only prevent some bugs. To take a classic problem of retrieving an index value from an array, since the type system doesn't specify the length of the array, the compiler has no way of knowing that a request for the "value of index 4" from an array of only two elements is "unsafe". We restore safety via exception handling, but the ideal type system will prevent us from doing anything that is unsafe in the first place and this is where we start to borrow ideas from a language such as Haskell, with its concept of "dependent types". If the type of an array includes its length, we can ensure that any index accesses into the array are valid. The problem is that we now need to carry around the length of arrays and the values of indices throughout our code so that it can be type-checked. In general, writing the specification to prove a positive property, even for a problem very amenable to specification, such as a simple sorting algorithm, turns out to be very hard and the specification will be different for every program. Extend this to writing a specification for, say, Microsoft Word and we can see that the specification would end up being no simpler, and therefore no less buggy, than the implementation. Fortunately, it is easier to write a specification that proves that a program doesn't have certain, specific and undesirable properties, such as infinite loops or accesses to the wrong bit of memory. If we can write the specifications to prove that a program is immune to such problems, we could reuse them in many places. The problem is the lack of specification "provers" that can do this without a lot of manual intervention (i.e. hints from the programmer). All this might feel a very long way off, but computing power and our understanding of the theory of "provers" advances quickly, and Microsoft is doing some of it already. Via their Terminator research project they have started to prove that their device drivers will always terminate, and in so doing have suddenly eliminated a vast range of possible bugs. This is a huge step forward from saying, "we've tested it lots and it seems fine". What do you think? What might be good targets for specification and verification? SQL could be one: the cost of a bug in SQL Server is quite high given how many important systems rely on it, so there's a good incentive to eliminate bugs, even at high initial cost. 
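
    To make the dependent-types idea a little more concrete, here is a loose analogy sketched in C++ (my illustration, not the editorial's): std::array carries its length in its type, so an out-of-range index supplied through std::get is rejected at compile time rather than discovered at run time.

        #include <array>

        int main() {
            std::array<int, 2> values{10, 20};   // the length 2 is part of the type
            int ok = std::get<1>(values);        // fine: index 1 < 2, checked by the compiler
            // int bad = std::get<4>(values);    // will not compile: the "value of index 4" bug is rejected
            return ok;
        }

    It is only an analogy, of course: the index has to be a compile-time constant here, whereas true dependent types let the relationship between lengths and indices be tracked through arbitrary code.
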
[Many thanks to Mike Williamson for guidance and useful conversations during the writing of this piece] Cheers, Tony.

    Read the article

  • C++ and SDL Trouble Creating a STL Vector of a Game Object

    - by Jackson Blades
    I am trying to create a Space Invaders clone using C++ and SDL. The problem I am having is in trying to create Waves of Enemies. I am trying to model this by making my Waves a vector of 8 Enemy objects. My Enemy constructor takes two arguments, an x and y offset. My Wave constructor also takes two arguments, an x and y offset. What I am trying to do is have my Wave constructor initialize a vector of Enemies, and have each enemy given a different x offset so that they are spaced out appropriately. Enemy::Enemy(int x, int y) { box.x = x; box.y = y; box.w = ENEMY_WIDTH; box.h = ENEMY_HEIGHT; xVel = ENEMY_WIDTH / 2; } Wave::Wave(int x, int y) { box.x = x; box.y = y; box.w = WAVE_WIDTH; box.y = WAVE_HEIGHT; xVel = (-1)*ENEMY_WIDTH; yVel = 0; std::vector<Enemy> enemyWave; for (int i = 0; i < enemyWave.size(); i++) { Enemy temp(box.x + ((ENEMY_WIDTH + 16) * i), box.y); enemyWave.push_back(temp); } } I guess what I am asking is if there is a cleaner, more elegant way to do this sort of initialization with vectors, or if this is right at all. Any help is greatly appreciated.
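
    A couple of things stand out in the posted constructor (this is a sketch of one possible tidy-up, not the only correct answer): the loop condition uses enemyWave.size(), which is 0 for a freshly constructed vector, so no enemies are ever added; enemyWave is a local variable that is destroyed when the constructor returns, so it should be a member of Wave; and the second box.y assignment looks like it was meant to be box.h. Assuming hypothetical values for the width/height constants, something along these lines compiles and fills the wave:

        #include <vector>

        struct Rect { int x, y, w, h; };                      // stand-in for SDL_Rect

        const int ENEMY_WIDTH  = 32;                          // hypothetical sizes; use the real constants
        const int ENEMY_HEIGHT = 32;
        const int WAVE_SIZE    = 8;                           // eight enemies per wave
        const int WAVE_WIDTH   = WAVE_SIZE * (ENEMY_WIDTH + 16);
        const int WAVE_HEIGHT  = ENEMY_HEIGHT;

        class Enemy {
        public:
            Enemy(int x, int y) : box{x, y, ENEMY_WIDTH, ENEMY_HEIGHT}, xVel(ENEMY_WIDTH / 2) {}
        private:
            Rect box;
            int xVel;
        };

        class Wave {
        public:
            Wave(int x, int y) : box{x, y, WAVE_WIDTH, WAVE_HEIGHT}, xVel(-ENEMY_WIDTH), yVel(0) {
                enemyWave.reserve(WAVE_SIZE);
                // Loop over the intended wave size, not enemyWave.size(), which is 0 here.
                for (int i = 0; i < WAVE_SIZE; ++i)
                    enemyWave.emplace_back(box.x + (ENEMY_WIDTH + 16) * i, box.y);
            }
        private:
            Rect box;
            int xVel, yVel;
            std::vector<Enemy> enemyWave;                     // a member, so the enemies outlive the constructor
        };

    With the vector as a member, the rest of the game loop can iterate over the wave's enemies (exposed through an accessor) to move and draw them each frame.
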

    Read the article

  • What's the best way to sell ReSharper to management? [closed]

    - by Jackson Pope
    Possible Duplicate: How do you convince your boss to buy useful tools like Resharper, LinqPad? I've recently started a new job developing code in C# and ASP.Net. At a previous employer I've used ReSharper from JetBrains and I loved it. I've downloaded the free trial in my new job, as have several of my new colleagues on my recommendation. Everyone thinks it's great. But now our trials are coming to an end and it's time to buy or say goodbye. I've been reliably informed that getting money for tools from senior management is like trying to get blood from a stone, so how can I convince them to loosen their grip on the purse strings and buy it for our team (of seven developers)? Does anyone have any experience of convincing management of the benefits of refactoring tools? I feel the benefit every second I use it, but I'm having difficulty thinking of how to explain the concrete benefits to a manager who only think

    Read the article

  • Oracle Linux Partner Pavilion Spotlight - Part II

    - by Ted Davis
    As we draw closer to the first day of Oracle OpenWorld, starting in less than a week, we continue to showcase some of our premier partners exhibiting in the Oracle Linux Partner Pavilion (Booth #1033). We have Independent Hardware Vendors, Independent Software Vendors and Systems Integrators that show the breadth of support in the Oracle Linux and Oracle VM ecosystem. In today's post we highlight three additional Oracle Linux / Oracle VM Partners from the pavilion. Micro Focus delivers mainframe solutions software and software delivery tools with its Borland products. These tools are grouped under the following solutions: Analysis and testing tools for JDeveloper Micro Focus Enterprise Analyzer is key to the success of application overhaul and modernization strategies by ensuring that they are based on a solid knowledge foundation. It reveals the reality of enterprise application portfolios and the detailed constructs of business applications. COBOL for Oracle Database, Oracle Linux, and Tuxedo Micro Focus Visual COBOL delivers the next generation of COBOL development and deployment. It brings the productivity of the Eclipse IDE to COBOL, and provides the ability to deploy key business-critical COBOL applications to Oracle Linux both natively and under a JVM. Migration and Modernization tooling for mainframes Enterprise application knowledge, development, test and workload re-hosting tools significantly improve the efficiency of business application delivery, enabling CIOs and IT leaders to modernize application portfolios and target platforms such as Oracle Linux. When it comes to Oracle Linux database environments, supporting high transaction rates with minimal response times is no longer just a goal. It’s a strategic imperative. The “data deluge” is impacting the ability of databases and other strategic applications to access data and provide real-time analytics and reporting. As such, customer demand for accelerated application performance is increasing. Visit LSI at the Oracle Linux Pavilion, #733, to find out how LSI Nytro Application Acceleration products are designed from the ground up for database acceleration. Our intelligent solid-state storage solutions help to eliminate I/O bottlenecks, increase throughput and enable Oracle customers to achieve the highest levels of DB performance. Accelerate Your Exadata Success With Teleran. Teleran’s software solutions for Oracle Exadata and Oracle Database reduce the cost, time and effort of migrating and consolidating applications on Exadata. In addition, Teleran delivers visibility and control solutions for BI/data warehouse performance and user management that ensure service levels and cost efficiency. Teleran will demonstrate these solutions at the Oracle OpenWorld Linux Pavilion: Consolidation Accelerator - Reduces the cost, time and risk of migrating and consolidating applications on Exadata. Application Readiness – Identifies legacy application performance enhancements needed to take advantage of Exadata performance features Workload Accelerator – Identifies and clusters workloads for faster performance on Exadata Application Visibility and Control - Improves performance, user productivity, and alignment to business objectives while reducing support and resource costs. Thanks for reading today's Partner Spotlight. Three more partners will be highlighted tomorrow. If you missed our first Partner Spotlight, check it out here.

    Read the article

  • Consuming too much power on a Dell 15R

    - by Daniel Davis
    I'm new to Ubuntu, and I've just installed Ubuntu 11.10, and I must say I really like it a lot. I'm getting used to Ubuntu faster than I thought, but the only issue I'm having is the power consumption. I have a "Dell 15R with BIOS A07" and I uninstalled Ubuntu 11.04 because of some glitches I hated. None of those appear on this Ubuntu, but now my computer discharges way too quickly. When I check the power icon, it states that I have like 3:45 min remaining, which is fairly the same as what Win7 and Ubuntu 11.04 used to tell me, but 15 min later I check again and it gives me only 1:15 min remaining. Also, my computer seems to get a lot hotter than it used to. Are there more options to control the power usage on my laptop?

    Read the article

  • Instantiating objects in Java

    - by Davis Naglis
    I'm now learning Java from scratch, and since I started to learn about instantiating objects, I don't understand in which cases I need to instantiate them. For example, I'm studying the TutsPlus course about it, and there is an example about a "Rectangle" class. The instructor says that it needs to be instantiated. So I started to wonder: when do I need to instantiate objects when writing Java code?

    Read the article

  • Concurrent Affairs

    - by Tony Davis
    I once wrote an editorial, multi-core mania, on the conundrum of ever-increasing numbers of processor cores, but without the concurrent programming techniques to get anywhere near exploiting their performance potential. I came to the controversial conclusion that, while the problem loomed for all procedural languages, it was not a big issue for the vast majority of programmers. Two years later, I still think most programmers don't concern themselves overly with this issue, but I do think that's a bigger problem than I originally implied. Firstly, is the performance boost from writing code that can fully exploit all available cores worth the cost of the additional programming complexity? Right now, with quad-core processors that, at best, can make our programs four times faster, the answer is still no for many applications. But what happens in a few years, as the number of cores grows to 100 or even 1000? At this point, it becomes very hard to ignore the potential gains from exploiting concurrency. Possibly, I was optimistic to assume that, by the time we have 100-core processors, and most applications really needed to exploit them, some technology would be around to allow us to do so with relative ease. The ideal solution would be one that allows programmers to forget about the problem, in much the same way that garbage collection removed the need to worry too much about memory allocation. From all I can find on the topic, though, there is only a remote likelihood that we'll ever have a compiler that takes a program written in a single-threaded style and "auto-magically" converts it into an efficient, correct, multi-threaded program. At the same time, it seems clear that what is currently the most common solution, multi-threaded programming with shared memory, is unsustainable. As soon as a piece of state can be changed by a different thread of execution, the potential number of execution paths through your program grows exponentially with the number of threads. If you have two threads, each executing n instructions, then there are 2^n possible "interleavings" of those instructions. Of course, many of those interleavings will have identical behavior, but several won't. Not only does this make understanding how a program works an order of magnitude harder, but it will also result in irreproducible, non-deterministic, bugs. And of course, the problem will be many times worse when you have a hundred or a thousand threads. So what is the answer? All of the possible alternatives require a change in the way we write programs and, currently, seem to be plagued by performance issues. Software transactional memory (STM) applies the ideas of database transactions, and optimistic concurrency control, to memory. However, working out how to break down your program into sufficiently small transactions, so as to avoid contention issues, isn't easy. Another approach is concurrency with actors, where instead of having threads share memory, each thread runs in complete isolation, and communicates with others by passing messages. It simplifies concurrent programs but still has performance issues, if the threads need to operate on the same large piece of data. There are doubtless other possible solutions that I haven't mentioned, and I would love to know to what extent you, as a developer, are considering the problem of multi-core concurrency, what solution you currently favor, and why. Cheers, Tony.
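
    To give a flavour of the actor-style alternative, here is a minimal message-passing sketch (my own illustration, not code from the editorial): one thread owns the mutable state outright, and other threads influence it only by posting messages to its mailbox, so there is no shared state for the interleavings to corrupt.

        #include <condition_variable>
        #include <iostream>
        #include <mutex>
        #include <queue>
        #include <thread>

        struct Message { int amount; bool stop; };

        class Mailbox {
        public:
            void post(Message m) {
                { std::lock_guard<std::mutex> lock(mtx); messages.push(m); }
                cv.notify_one();
            }
            Message receive() {
                std::unique_lock<std::mutex> lock(mtx);
                cv.wait(lock, [this] { return !messages.empty(); });
                Message m = messages.front();
                messages.pop();
                return m;
            }
        private:
            std::mutex mtx;
            std::condition_variable cv;
            std::queue<Message> messages;
        };

        int main() {
            Mailbox box;
            std::thread counter([&box] {
                int total = 0;                               // state owned by exactly one thread
                for (;;) {
                    Message m = box.receive();
                    if (m.stop) break;
                    total += m.amount;
                }
                std::cout << "total = " << total << '\n';    // always 3000, regardless of interleaving
            });
            std::thread a([&box] { for (int i = 0; i < 1000; ++i) box.post({1, false}); });
            std::thread b([&box] { for (int i = 0; i < 1000; ++i) box.post({2, false}); });
            a.join();
            b.join();
            box.post({0, true});                             // tell the counter to finish
            counter.join();
        }

    The mailbox is still built on a lock internally and, as the editorial notes, performance suffers if every operation has to funnel a large piece of data through messages; the point is only that the counter's state has a single owner, so the explosion of interleavings never touches it.
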

    Read the article

  • Oracle Linux Partner Pavilion Spotlight III

    - by Ted Davis
    Three days until Oracle OpenWorld 2012 begins. The anticipation and excitement are building. In today's spotlight we are presenting an additional three partners exhibiting in the Oracle Linux Partner Pavilion at Oracle OpenWorld (Booth #1033). Fujitsu will showcase a Gold tower system representing the one-millionth PRIMERGY server shipped, highlighting Fujitsu’s position as the #4 server vendor worldwide. Fujitsu’s broad range of server platforms is reshaping the data center with virtualization and cloud services, including those based on Oracle Linux and Oracle VM. BeyondTrust, the leader in providing context-aware security intelligence, will be showcasing its threat management and policy enablement solutions for addressing IT security risks and simplifying compliance. BeyondTrust will discuss how to reduce security risks, close security gaps and improve visibility across your server and database infrastructure. Please stop by to see live demonstrations of BeyondTrust’s award-winning vulnerability management and privilege identity management solutions supported on Oracle Linux. Virtualized infrastructure with Oracle VM and NetApp storage and data management solutions provides an integrated and seamless end user experience. Designed for maximum efficiency to allow for native NetApp deduplication and backup/recovery/cloning of VMs or templates. Whether you are provisioning one or multiple server pools or dynamically re-provisioning storage for your virtual machines to meet business demands, with Oracle and NetApp, you have one single point-and-click console to rapidly and easily deploy a virtualized agile data infrastructure in minutes. So there you have it! The third installment of our Partner Spotlight. Check out Part I and Part II of our Partner Spotlights from previous days if you've missed them. Remember to visit the Oracle Linux team at Oracle OpenWorld.

    Read the article

  • jquery is not working over local network [migrated]

    - by Kortyell Davis
    I have a Fedora server running the Apache web server. The server is connected to a home network. I have a laptop connected to the same network. I can enter the IP address of my server into the browser on my laptop and pull up the index.html file located in the document root directory of the Fedora home server. The index.html file contains jQuery code. The jQuery code only works when I open it locally in my browser (e.g. right click, open with Firefox), but when I attempt to view the webpage from my laptop the jQuery code is not executed. The code is below. <script type="text/javascript" src="jquery-1.8.2.js"></script> $(document).ready(function() { $('#form').hide(); $('input[type=text]').focus(function() { $(this).val(''); }); $('input[type=password]').focus(function() { $(this).val(''); }); $('.form').hide(); $('#log').click(function(){ $('#form').toggle(); }); $('#reg').click(function(){ $('.form').toggle(); }); });

    Read the article

  • how to convert bitmap image to CIELab image?

    - by Neal Davis
    Hello to stackoverflow members! This is my first post but I love reading the questions and answers on this forum! I'm using VS2008, C#, .NET 3.5 and Windows XP Pro with the latest patches on this project. I am reading in a JPEG file to a bitmap image using C# in the following way: Bitmap bmp = new Bitmap("background.jpg"); Graphics g = Graphics.FromImage(bmp); I then write some text and logos on g using various DrawString and DrawImage commands. I then want to write this bitmap to a TIFF file using the command: bmp.Save("final_artwork.tif", ImageFormat.Tiff); I don't know what the TIFF tags in the header of the final resultant TIFF file will be when you do bmp.Save as shown above. But what I want to produce is a TIFF image file with PHOTOMETRIC CIELab, BITSPERSAMPLE 8, SAMPLESPERPIXEL 3, FILLORDER MSB2LSB, ORIENTATION TOPLEFT, PLANARCONFIG CONTIG, and also uncompressed and singlestrip. I'm fairly certain, at least I have not found anything about the Microsoft API and CIELab, that I cannot convert the bitmap image to CIELab photometric and the other requirements using anything from the Microsoft API or .NET 3.5 environment - someone please correct me if I am wrong. I have used libtiff in a project in the past, but just to write a TIFF file where the TIFF image in memory was already in CIELab format and met all the other requirements as noted per above. I'm not certain libtiff can convert from my C# bitmap image, which is probably RGB photometric, to CIELab, 8 bps, and 3 spp? Maybe I can use OpenCVImage or Emgu CV or something similar? So the main question is what is going to be the easiest way to get my final bitmap image as outlined above into a CIELab TIFF and also meeting the other requirements per above? Thanks for any suggestions and code fragments you can provide! Neal Davis
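
    For the colour-space part of the question, the RGB-to-CIELAB math itself is standard and does not need anything from the Microsoft API. A rough per-pixel sketch follows (written in C++ rather than the poster's C#, assuming sRGB input and the D65 reference white, and leaving aside the TIFF tagging, which a library such as libtiff would still have to handle):

        #include <cmath>

        struct Lab { double L, a, b; };

        // Undo the sRGB gamma so the values are linear light (input in [0,1]).
        static double srgbToLinear(double c) {
            return (c <= 0.04045) ? c / 12.92 : std::pow((c + 0.055) / 1.055, 2.4);
        }

        // CIE L*a*b* helper function.
        static double labF(double t) {
            const double epsilon = 216.0 / 24389.0;   // (6/29)^3
            const double kappa   = 24389.0 / 27.0;
            return (t > epsilon) ? std::cbrt(t) : (kappa * t + 16.0) / 116.0;
        }

        Lab rgbToLab(unsigned char r8, unsigned char g8, unsigned char b8) {
            double r = srgbToLinear(r8 / 255.0);
            double g = srgbToLinear(g8 / 255.0);
            double b = srgbToLinear(b8 / 255.0);

            // Linear sRGB -> CIE XYZ (D65).
            double X = 0.4124564 * r + 0.3575761 * g + 0.1804375 * b;
            double Y = 0.2126729 * r + 0.7151522 * g + 0.0721750 * b;
            double Z = 0.0193339 * r + 0.1191920 * g + 0.9503041 * b;

            const double Xn = 0.95047, Yn = 1.00000, Zn = 1.08883;   // D65 white point
            double fx = labF(X / Xn), fy = labF(Y / Yn), fz = labF(Z / Zn);

            // L* is in [0,100]; a* and b* are roughly in [-128,127].
            return { 116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz) };
        }

    To meet the 8-bits-per-sample requirement, the TIFF CIELab encoding still has to be applied afterwards (L* scaled to 0..255, a* and b* stored as signed 8-bit values), which is the part the TIFF tags describe rather than compute.
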

    Read the article

  • Windows file compare (FC) spurious differences

    - by user165568
    I'm getting differences like this: a.txt Betty Davis Cathy Edwards b.txt Betty Davis Cathy Edwards There are only two lines listed in the diff (which doesn't make sense). No CR/LF/Newline funnies. The difference just moves down if I delete lines. Same problem on Win7 and Win2K. The difference seems to go away if I remove all empty lines from the files. The empty lines are correctly terminated too. Using /C /W (ignore case, ignore whitespace). Has anyone seen this before? What am I doing wrong? How can I fix it? There are real diffs in the file (missing, extra, or re-spelled names) but the files are byte-for-byte identical at the listed diff.

    Read the article

  • Google Talk Chat/Conference Solutions

    - by Adam Davis
    I started using the old confbot Python conference script in 2005 for my family. This essentially implements an IRC-like conference room over Google Talk (or any Jabber/XMPP server). It has significantly increased family communication, and has become rather indispensable due to this. Recently it's begun to have severe problems (people can't see each other in the conference room) which has nearly killed the usefulness of it. Before I develop my own software or debug confbot (probably not - it uses an older Jabber library that hasn't been updated since 2003) I wanted to see what other solutions exist that meet our needs: Supports Google Talk (Sorry, I'm not going to try to convince everyone involved to move to a new IM or other client) Free and open source (ideal, but not required) Runs on Windows (Not a web service run by someone else) Implements basic functionality such as kick/ban, emotes Remembers who joined the conference room across restarts Obeys Do Not Disturb and Busy status Archives all activity -Adam

    Read the article

  • Where can I ask questions that aren't IT questions?

    - by Adam Davis
    Thank you for your confidence in our abilities But have you read this? FAQ We get a lot of interesting technical questions on here, but Server Fault is meant to be first and foremost a resource for system administrators and IT professionals. In other words, Server Fault isn't meant to cater to every computer problem - just those that only exist in or can best be solved in the IT and sysadmin domain Yes, someone here might be able to help you, but you'll find that other forums more focused on your topic can give you a much better answer than a bunch of IT professionals. It's likely that your question will be downvoted, closed, and in some cases marked "offensive." It's not that we hate you, it's just that we like to keep our corn pops separate from our cocoa puffs. In that vein, here are a bunch of other forums where you can get help: (This list contains the top two forums for each category as voted for below.) Programming Stackoverflow Consumer Level Computer Hardware Ars Technica OpenForum Tom's Hardware Forums Computer Software Anandtech Forum Web Design/Hosting/CMS SitePoint Forums Web Hosting Talk Math, Science, Engineering Physics Forums PlanetMath Other/General Ask Metafilter Google Search Mahalo Answers Check out the answers below for even more suggestions! What forums can people go to to ask the questions that are off topic here? Please list only ONE forum per answer so votes can bring the best forums to the top. Return to FAQ Index

    Read the article

  • Can an installation of SSRS be used for other reports if the SCOM Reporting Role is installed?

    - by Pete Davis
    I'm currently in the process of planning a SCOM 2007 R2 deployment and would like to deploy the OperationsManagerDW and Reporting Server to a shared SQL 2008 cluster which is used for reporting across multiple solutions. However, in the deployment guide for SCOM 2007 R2 it says: Due to changes that the Operations Manager 2007 Reporting component makes to SQL Server Reporting Services security, no other applications that make use of SQL Server Reporting Services can be installed on this server. This concerns me, as it may interfere with existing or future (non-SCOM) reports in some way even if deployed as a separate SSRS instance. Later in the same guide it states: Installing Operations Manager 2007 Reporting Services integrates the security of the instance of SQL Reporting Services with the Operations Manager role-based security. Do not install any other Reporting Services applications in this same instance of SQL Server. Does this mean that I can install a new SSRS instance and use this on the shared cluster for SCOM reporting, or that I'd also need to create a whole new SQL Server instance as well as an SSRS instance, or that I'd need a whole separate server for the SCOM OperationsManagerDW and Reporting Server?

    Read the article

  • creating email forwarding accounts on the fly using PHP

    - by Phil Jackson
    I've recently asked a question on Stack Overflow; http://stackoverflow.com/questions/2637137/creating-email-forwarding-accounts-on-the-fly-using-php and was told I would find the solution here. The first instruction was: Ask on Server Fault what operations you have to perform, what config file to edit, or something. Any help would be much appreciated. regards, phil

    Read the article

  • Is there a way to know what the Windows Disk Cleanup utility will delete?

    - by Cam Jackson
    When I run the Disk Cleanup utility that's built into Windows 8, it tells me that it can free up 53GB by deleting 'Temporary Files'. However, a CCleaner analysis on default settings only finds about 300MB worth of space to free up, so I'm wondering what Disk Cleanup has found that CCleaner does not. Note that this question appears to be similar to what I'm asking, but the accepted answer says that 'Temporary Files' refers to %TEMP%. I've already cleared out most of C:\Users\Cam\AppData\Local\Temp, and it now has only 230MB of stuff in it, even with system files showing. So where is this 53GB located? Is there a way to find out what it is? Edit: I should note that this is on a 110GB SSD, so it's almost half the drive. And in fact I'm only using 86GB, so if it's really going to clear out 53GB, that would be more than 60% of the stuff on my C drive. I'm starting to think that Disk Cleanup caches its analysis, and hasn't updated since I started cleaning up the drive earlier today. Although when I run it it says that it's 'Calculating' how much space can be saved, and it takes about 5-10 seconds to do so. Hmmm... Edit2: Here is what my hard drive looks like, according to SpaceMonger (screenshot omitted): You can see why I was starting to think that the 53GB figure is actually wrong. Even if 'Temporary Files' includes my hiberfil and everything in WinSxS (about 13GB total), that would be 26GB, which is only halfway there. Hard to see where there's 53GB of stuff to delete.

    Read the article

< Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >