Search Results

Search found 3414 results on 137 pages for 'happy emi'.

Page 52/137

  • Can I migrate from GNU Mailman to MailChimp?

    - by Flowpoke
    I have ~20 lists, all of which are mostly announce-only (newsletters--recipients do not reply back to the list), running in GNU Mailman. It's alright. Mailman has certainly proven itself, but we want some progressive features and better peace of mind (delivery success, hosting, etc.; we'd be happy paying a 3rd party to handle these things). Can MailChimp give us what we need? I see tons of fun copy and graphics showing off neat features, but what I really want to know is: if MailChimp is doing the mailings, what does the address look like? Is MailChimp good for sending out simple newsletters? What about automatic bounce processing / unsubscribing of users? I set up a free account but I don't see how any of it integrates into my own domain... no DNS overrides or CNAME suggestions. Also, I see MailChimp has a clean and nifty API client in Python that I want to integrate into our sites (Django powered), so that really makes the service attractive to me--I just hope I understand it correctly.

    Read the article

  • Dutch ACEs SOA Partner Community Award Celebration

    - by JuergenKress
    When you win you need to celebrate. This was the line of thinking when I found out that I was part of a group that won the Oracle SOA Community Country Award. Well – thinking about a party is one thing; preparing it and finally having the small party is something completely different. It starts with finding a date that would be suitable for the majority of invited people. As you can imagine, the SOA ACEs and ACE Directors have a busy life that takes them places. Alongside that, they are engaged with customers who want to squeeze every bit of knowledge out of them. So everybody is pretty busy (that’s what makes you an ACE). After some deliberation (and checks of international Oracle events, TripIt, blogs and tweets) a date was chosen. Meeting on a Friday evening for some drinks is probably not a Dutch-only activity. But as some of the ACEs are self-employed, they miss the companies around them to organize such events. Come the day, a turn-out of almost 50% was great – although I had expected a few more folks. This was mainly due to some illness and work overload. Luckily the mini-party got going, (alcoholic) beverages were consumed, food was appreciated, a decent picture was taken (see below) and all had a good chat and hopefully a good time. (Above from left to right: Eric Elzinga, Andreas Chatziantoniou, Mike van Aalst, Edwin Biemond) All in all a nice evening and certainly a "meeting" which can be repeated. For the full article please visit Andreas's blog. Want to organize a local SOA & BPM community? Let us know – we are more than happy to support you! To receive more information and become a member of the SOA & BPM Partner Community, please register at http://www.oracle.com/goto/emea/soa (OPN account required) Blog Twitter LinkedIn Mix Forum Technorati Tags: Eric Elzinga, Andreas Chatziantoniou, Mike van Aalst, Edwin Biemond, Dutch SOA Community, SOA Community, Oracle, OPN, Jürgen Kress, ACE

    Read the article

  • Week 11: Spring Break Destination: Specialization

    - by sandra.haan
    Oh how we miss Spring Break - a whole week off from school to play in the sun and get re-charged. You are probably sitting at your computer right now wishing your feet were in the sand on a warm beach somewhere instead of at your desk. Sadly, we can't transport you to a tropical paradise, but we can offer a quick Spring Break with OPN Specialized (shoes optional). Ingredients:
    - 1 dose of Sun FAQ
    - 1 pinch of OPN Specialized awareness
    - 6 OPN Specialized Webcasts
    - 1.5 months of promotional pricing
    Slather yourself in Sun knowledge by reviewing the FAQ. Once armed with the direction for Sun partners, relax and dive into a good read on OPN Specialized - ahh yes, that's right - the new OPN program offering you the ability to differentiate yourself. You must be exhausted from all of that work - you are on break after all. Once rested, map out an excursion and plan to attend 1 of 6 upcoming OPN Specialized sessions. These will walk you through the steps you need to take to become Specialized. Once completed, reflect on your journey and join OPN Specialized while the promotional pricing is still available. Just like any other trip, you want to know what others are saying about the destination - listen in as Judson talks about the OPN Specialized Webcast series. Feel free to add your own ingredients to this recipe and don't forget to reach out to the Oracle Partner Business Center with any of your questions on OPN Specialized.
    Happy Spring Break,
    The OPN Communications Team

    Read the article

  • Message Passing Interface (MPI)

    So you have installed your cluster and you are done with introductory material on Windows HPC. Now you want to develop an application with the most common programming model: Message Passing Interface. The MPI programming model is a standard with implementations from many vendors. For newbies (like myself!), I have aggregated below links for getting started.
    Non-Microsoft MPI resources (useful even if you are not on the Windows platform)
    1. Message Passing Interface on Wikipedia.
    2. The MPI standard.
    3. MPICH2 - an MPI implementation.
    4. Tutorial on MPI by William Gropp.
    5. MPI patterns presented as a tutorial with sample code.
    6. THE official MPI Forum (maintains the standard) including the wiki discussing the MPI future.
    7. Great MPI tutorial including at the end the MPI Exercise.
    8. C++ MPI Exercises by John Burkardt.
    9. Book online: MPI The Complete Reference.
    MS-MPI
    10. Windows HPC Server 2008 - Using MS-MPI whitepaper (15-page doc).
    11. Tracing MPI applications (27-page doc).
    12. Using Microsoft MPI (TechNet section).
    13. Windows HPC Server MPI forum (for posting questions).
    MPI.NET
    14. MPI.NET Home Page (not owned by Microsoft).
    15. MPI.NET Tutorial.
    16. HPC Development using F# using MPI.NET (38-page doc).
    Next time I'll post resources for the Microsoft Cluster SOA programming model - happy coding... Comments about this post welcome at the original blog.
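
    Since a couple of the links above cover MPI.NET, here is a minimal MPI.NET "hello world" sketch for orientation (my own illustration, not from the resources listed). It assumes the MPI.NET assembly and an MPI runtime such as MS-MPI are installed, and it is launched through mpiexec rather than run directly:

        // Minimal MPI.NET sketch (assumes the MPI.NET assembly and an MPI runtime such as MS-MPI).
        // Launch with something like: mpiexec -n 4 HelloMpi.exe
        using System;
        using MPI;

        class HelloMpi
        {
            static void Main(string[] args)
            {
                // MPI.Environment initializes MPI on construction and finalizes it when disposed.
                using (new MPI.Environment(ref args))
                {
                    Intracommunicator world = Communicator.world;
                    Console.WriteLine("Hello from rank {0} of {1}", world.Rank, world.Size);
                }
            }
        }

    Each process started by mpiexec prints its own rank; from there the tutorials above introduce point-to-point and collective operations.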

    Read the article

  • Microsoft SQL Server 2012 Analysis Services – The BISM Tabular Model #ssas #tabular #bism

    - by Marco Russo (SQLBI)
    Alberto, Chris and I spent many months (many nights, holidays and also working days over the last months) writing the book we would have liked to read when we started working with Analysis Services Tabular. A book that explains how to use Tabular, how to model data with Tabular, how Tabular works internally and how to optimize a Tabular model. All those things you need to start on a real project in order to make a happy customer. You know, we’re all consultants after all, so customer satisfaction is really important if we want to be paid for our job! Now the writing is finished, we’re in the final stage of editing and reviews, and we look forward to getting our print copy. Its title is very long: Microsoft SQL Server 2012 Analysis Services – The BISM Tabular Model. But the important thing is that you can already (pre)order it. This is the list of chapters:
    01. BISM Architecture
    02. Guided Tour on Tabular
    03. Loading Data Inside Tabular
    04. DAX Basics
    05. Understanding Evaluation Contexts
    06. Querying Tabular
    07. DAX Advanced
    08. Understanding Time Intelligence in DAX
    09. Vertipaq Engine
    10. Using Tabular Hierarchies
    11. Data Modeling in Tabular
    12. Using Advanced Tabular Relationships
    13. Tabular Presentation Layer
    14. Tabular and PowerPivot for Excel
    15. Tabular Security
    16. Interfacing with Tabular
    17. Tabular Deployment
    18. Optimization and Monitoring
    And this is the book cover – have a good read!

    Read the article

  • Friday Tips #33

    - by Chris Kawalek
    Happy Friday, everyone! Our tip this week is from an excellent white paper written by our own Greg King titled Oracle VM 3: Building a Demo Environment using Oracle VM VirtualBox. In it, Greg gives you everything you need to know to set up Oracle VM Server inside of Oracle VM VirtualBox for testing and demoing. The section we're highlighting below is on how to configure the network interfaces of your virtual machines: VirtualBox comes with a few different types of network interfaces that can be used to allow communication between the VM guests and the host operating system, including network interfaces that will allow the VM guests to communicate with local and wide area networks accessed from your laptop or personal computer. However, for the purpose of the demonstration environment we will limit the network communication to include access just between your desktop and the virtual machines being managed by VirtualBox. The install process for Oracle VM VirtualBox creates a single host-only network device on your laptop or personal computer. Using the host-only network device will allow you to open a browser on your desktop to access the Oracle VM Manager running within the VirtualBox VM guest. The device will only allow network traffic between the VM guests and your host operating system, but nothing outside the confines of your laptop or personal computer. We will need to add a second host-only network since the Oracle VM Server appliance has both eth0 and eth1 configured. You can choose to use eth1 on the Oracle VM Servers or not use them – the choice is yours. But, at least the host side network device will exist if you decide to use it. Greg goes on to describe in detail how to setup the network interfaces, so you can head on over to the paper and get even more info. See you next week! -Chris 

    Read the article

  • BizTalk 2009 - BizTalk Server Best Practice Analyser

    - by StuartBrierley
    The BizTalk Server Best Practices Analyser allows you to carry out a configuration-level verification of your BizTalk installation, evaluating the deployed configuration but not modifying or tuning anything that it finds. The Best Practices Analyser uses "reading and reporting" to gather data from different sources, such as:
    - Windows Management Instrumentation (WMI) classes
    - SQL Server databases
    - Registry entries
    When I first ran the analyser I got a number of errors. If you get any errors, these should all be acted upon to resolve them; you should then run the scan again and see if anything else is reported that needs acting upon. As you can see in the image above, the initial issue that jumped out at me was that the SQL Server Agent was not started. The reason for this was absent-mindedness - this run was against my development PC and I don't have SQL/BizTalk actively running unless I am using them. Starting the agent service and running the scan again gave me the following results: This resolved most of the issues for me, but the next major issue to look at was that there was no tracking host running. You can also see that I was still getting an error with two of the SQL jobs. The problem here was that I had not yet configured these two SQL jobs. Configuring the backup and purge jobs and then starting the tracking host before running the scan again gave: This had cleared all the critical issues, but I did still have a number of warnings. For example, on this report I was warned that the BizTalk MessageBox is hosted on the BizTalk Server. While this is known to be less than ideal, it is as I expected on my development environment, where I have installed Visual Studio, SQL and BizTalk on my laptop, and I was happy to ignore this and other similar warnings. In your case you should take a look at any warnings you receive and decide what you want to do about each of them in turn.

    Read the article

  • SOA Governance Starts with People and Processes

    - by Jyothi Swaroop
    While we all agree that SOA governance is about people, processes and technology, some experts are of the opinion that SOA governance begins with people and processes but needs to be empowered with technology to achieve the best results. Here's an interesting piece from David Linthicum on ebizQ: In the world of SOA, the concept of SOA governance is getting a lot of attention. However, how SOA governance is defined and implemented really depends on the SOA governance vendor who just left the building within most enterprises. Indeed, confusion is a huge issue when considering SOA governance, and the core issues are more about the fundamentals of people and processes, and not about the technology. SOA governance is a concept used for activities related to exercising control over services in an SOA, including tracking the services, monitoring the services, and controlling changes made to the services, simply put. The trouble comes in when SOA governance vendors attempt to define SOA governance around their technology, all with different approaches to SOA governance. Thus, it's important that those building SOAs within the enterprise take a step back and understand what is really needed to support the concept of SOA governance. The value of SOA governance is pretty simple. Since services make up the foundation of an SOA, and are at their essence the behavior and information from existing systems externalized, it's critical to make sure that those accessing, creating, and changing services do so using a well-controlled and orderly mechanism. Those of you who already have governance in place, typically around enterprise architecture efforts, will be happy to know that SOA governance does not replace those processes, but becomes a mechanism within the larger enterprise governance concept. People and processes are the first things on the list to get under control before you begin to toss technology at this problem. This means establishing an understanding of SOA governance within the team members, including why it's important, who's involved, and the core processes that are to be followed to make SOA governance work. Indeed, the core SOA governance strategy should really be created independently of the technology. The technology will change over the years, but the core processes and discipline should be relatively durable over time.

    Read the article

  • Installing SharePoint 2010 and PowerPivot for SharePoint on Windows 7

    - by smisner
    Many people like me want (or need) to do their business intelligence development work on a laptop. As someone who frequently speaks at various events or teaches classes on all subjects related to the Microsoft business intelligence stack, I need a way to run multiple server products on my laptop with reasonable performance. Once upon a time, that requirement meant only that I had to load the current version of SQL Server and the client tools of choice. In today's post, I'll review my latest experience with trying to make the newly released Microsoft BI products work with a Windows 7 operating system. The entrance of Microsoft Office SharePoint Server 2007 into the BI stack complicated matters, and I started using Virtual Server to establish a "suitable" environment. As part of the team that delivered a lot of education during the Yukon pre-launch activities (that would be SQL Server 2005 for the uninitiated), I was working with four - yes, four - virtual servers. That was a pretty brutal workload for a 2GB laptop, which worked if I was very, very careful. It could also be a finicky and unreliable configuration, as I learned to my dismay at one TechEd session several years ago, when I had to reboot a very carefully cached set of servers just minutes before my session started. Although it worked, it came back to life very, very slowly, much to the displeasure of the audience. They couldn't possibly have been less pleased than me. At that moment, I resolved to get the beefiest environment I could afford and consolidate to a single virtual server. Enter the 4GB 64-bit laptop to preserve my sanity and my livelihood. Likewise, for SQL Server 2008, I managed to keep everything within a single virtual server and I could function reasonably well with this approach. Now we have SQL Server 2008 R2 plus Office SharePoint Server 2010. That means a 64-bit operating system. Period. That means no more Virtual Server. That means I must use Hyper-V or another alternative. I've heard alternatives exist, but my few dabbles in this area did not yield positive results. It might have been just me having issues rather than any failure of those technologies to adequately support the requirements. My first run at working with the new BI stack configuration was to set up a 64-bit 4GB laptop with a dual-boot to run Windows Server 2008 R2 with Hyper-V. However, I was generally not happy with running Windows Server 2008 R2 on my laptop. For one, I couldn't put it into sleep mode, which is helpful if I want to prepare for a presentation beforehand and then walk to the podium without the need to hold my laptop in its open state along the way (my strategy at the TechEd session long, long ago). Secondly, it was finicky with projectors. I had issues from time to time and while I always eventually got it to work, I didn't appreciate those nerve-wracking moments wondering whether this would be the time that it wouldn't work. Somewhere along the way, I learned that it was possible to load SharePoint 2010 on Windows 7, which piqued my interest. I had just acquired a new laptop running Windows 7 64-bit, and thought surely running the BI stack natively on my laptop must be better than running Hyper-V. (I have not tried booting to Hyper-V VHD yet, but that's on my list of things to try, so the jury of one is still out on this approach.) Recently, I had to build up a server with the RTM versions of SQL Server 2008 R2 and SharePoint Server 2010 and decided to follow suit on my Windows 7 Ultimate 64-bit laptop.
    The process is slightly different, but I'm happy to report that it IS possible, although I had some fits and starts along the way. DISCLAIMER: These products are NOT intended to be run in production mode on the Windows 7 operating system. The configuration described in this post is strictly for development or learning purposes and not supported by Microsoft. If you have trouble, you will NOT get help from them. I might be able to help, but I provide no guarantees of my ability or availability to help. I won't provide the step-by-step instructions in this post as there are other resources that provide these details, but I will provide an overview of my approach, point you to the relevant resources, describe some of the problems I encountered, and explain how I addressed those problems to achieve my desired goal. Because my goal was not simply to set up SharePoint Server 2010 on my laptop, but specifically PowerPivot for SharePoint, I started out by referring to the installation instructions at the PowerPivot-Info site, but mainly to confirm that I was performing steps in the proper sequence. I didn't perform the steps in Part 1 because those steps are applicable only to a server operating system, which I am not running on my laptop. Then, the instructions in Part 2 won't work exactly as written for the same reason. Instead, I followed the instructions on MSDN, Setting Up the Development Environment for SharePoint 2010 on Windows Vista, Windows 7, and Windows Server 2008. In general, I found the following differences in installation steps from the steps at PowerPivot-Info: You must copy the SharePoint installation media to the local drive so that you can edit the config.xml to allow installation on a Windows client. You also have to manually install the prerequisites. The instructions provide links to each item that you must manually install and provide a command-line instruction to execute that enables required Windows features. I will digress for a moment to save you some grief in the sequence of steps to perform. I discovered later that a missing step in the MSDN instructions is to install the November CTP Reporting Services add-in for SharePoint. When I went to test my SharePoint site (I believe I tested after I had a successful PowerPivot installation), I ran into the following error: Could not load file or assembly 'RSSharePointSoapProxy, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified. I was rather surprised that Reporting Services was required. Then I found an article by Alan le Marquand, Working Together: SQL Server 2008 R2 Reporting Services Integration in SharePoint 2010, that instructed readers to install the November add-in. My first reaction was, "Really?!?" But I confirmed it in another TechNet article on hardware and software requirements for SharePoint Server 2010. It doesn't refer explicitly to the November CTP but following the link took me there. (Interestingly, I retested today and there's no longer any reference to the November CTP. Here's the link to download the latest and greatest Reporting Services Add-in for SharePoint Technologies 2010.) You don't need to download the add-in anymore if you're doing a regular server-based installation of SharePoint because it installs as part of the prerequisites automatically.
    When it was time to start the installation of SharePoint, I deviated from the MSDN instructions and from the PowerPivot-Info instructions: On the Choose the installation you want page of the installation wizard, I chose Server Farm. On the Server Type page, I chose Complete. At the end of the installation, I did not run the configuration wizard. Returning to the PowerPivot-Info instructions, I tried to follow the instructions in Part 3, which describes installing SQL Server 2008 R2 with the PowerPivot option. These instructions tell you to choose the New Server option on the Setup Role page where you add PowerPivot for SharePoint. However, I ran into problems with this approach and got installation errors at the end. It wasn't until much later, as I was investigating an error, that I encountered Dave Wickert's post that installing PowerPivot for SharePoint on Windows 7 is unsupported. Uh oh. But he did want to hear about it if anyone succeeded, so I decided to take the plunge. Perseverance paid off, and I can happily inform Dave that it does work so far. I haven't tested absolutely everything with PowerPivot for SharePoint but have successfully deployed a workbook and viewed the PowerPivot Management Dashboard. I have not yet tested the data refresh feature, but I have it installed. Continue reading to see how I accomplished my objective. I uninstalled SQL Server 2008 R2 and started again. I had different problems which I don't recollect now. However, I uninstalled again and approached installation from a different angle, and my next attempt succeeded. The downside of this approach is that you must do all of the things yourself that are done automatically when you install PowerPivot as a new server. Here are the steps that I followed:
    1. Install SQL Server 2008 R2 to get a database engine instance installed.
    2. Run the SharePoint configuration wizard to set up the SharePoint databases.
    3. In Central Administration, create a Web application using classic mode authentication, as per a TechNet article on PowerPivot Authentication and Authorization.
    Then I followed the steps I found at How to: Install PowerPivot for SharePoint on an Existing SharePoint Server. Especially important to note - you must launch setup by using Run as administrator. I did not have to manually deploy the PowerPivot solution as the instructions specify, but it's good to know about this step because it tells you where to look in Central Administration to confirm a successful deployment. I did spot some incorrect steps in the instructions (at the time of this writing) in How To: Configure Stored Credentials for PowerPivot Data Refresh. Specifically, in the section entitled Step 1: Create a target application and set the credentials, both steps 10 and 12 are incorrect. They tell you to provide an actual Windows user name and password on the page where you are simply defining the prompts for your application in the Secure Store Service. To add the Windows user name and password that you want to associate with the application - after you have successfully created the target application - you select the target application and then click Set credentials in the ribbon. Lastly, I followed the instructions at How to: Install Office Data Connectivity Components on a PowerPivot server. However, I have yet to test this in my current environment. I did have several stops and starts throughout this process and edited those out to spare you from reading non-essential information.
    I believe the explanation I have provided here accurately reflects the steps I followed to produce a working configuration. If you follow these steps and get a different result, please let me know so that together we can work through the issue and correct these instructions. I'm sure there are many other folks in the Microsoft BI community that will appreciate the ability to set up the BI stack in a Windows 7 environment for development or learning purposes.

    Read the article

  • IT Optimization Plan Pays Off For UK Retailer

    - by Brian Dayton
    I caught this article in ComputerworldUK yesterday. The headline talks about how UK-based supermarket chain Morrisons is increasing its IT spend... OK, sounds good. Even nicer that Oracle is a big part of that. But what caught my eye were three things:
    1) Morrisons truly has a long-term strategy for IT. In this case, modernizing and optimizing how they use IT for business advantage.
    2) Even in a tough economic climate, Morrisons views IT investments as contributing to and improving the bottom line. Specifically, "The investment in IT contributed to a 21 percent increase in Morrison's underlying profit."
    3) The phased, 3-year "Optimization Plan" took a holistic approach to their business--from CRM and Supply Chain systems to the underlying application infrastructure. On the infrastructure front, adopting a more flexible Service-Oriented Architecture enabled them to be more agile and adapt their business, and Identity Management helped with sometimes mundane (but costly) issues like lost passwords and being able to document who has access to what.
    Things don't always turn out so rosy. And I know it was a long and difficult process... but it's nice to see a happy ending every once in a while.

    Read the article

  • Assembly load and execute issue

    - by Jean Carlos Suárez Marranzini
    I'm trying to develop Assembly code allowing me to load and execute (by input of the user) 2 other Assembly .EXE programs. I'm having two problems:
    - I don't seem to be able to assign the pathname to a valid register (or maybe the syntax is incorrect)
    - I need to be able to execute the other program after the first one (could be either) has started its execution.
    This is what I have so far:
        mov ax,cs            ; moving code segment to data segment
        mov ds,ax
        mov ah,1h            ; here I read from keyboard
        int 21h
        mov dl,al
        cmp al,'1'           ; if 1 jump to LOADRUN1
        JE LOADRUN1
        popf
        cmp al,'2'           ; if 2 jump to LOADRUN2
        JE LOADRUN2
        popf
        LOADRUN1:
        MOV AH,4BH
        MOV AL,00
        LEA DX,[PROGNAME1]   ; Not sure if it works
        INT 21H
        LOADRUN2:
        MOV AH,4BH
        MOV AL,00
        LEA DX,[PROGNAME2]   ; Not sure if it works
        INT 21H
        ; Here I define the bytes containing the pathnames
        PROGNAME1 db 'C:\Users\Usuario\NASM\Adding.exe',0
        PROGNAME2 db 'C:\Users\Usuario\NASM\Substracting.exe',0
    I just don't know how to start another program by input in the 'parent' program, after one is already executing. Thanks in advance for your help! Any additional information I'll be more than happy to provide. I'm using NASM 16 bits, Windows 7 32 bits.

    Read the article

  • High CPU usage compared to WinXP. Common apps and actions use 100% CPU cycles

    - by Jopower
    I'm running Lubuntu 14.04.1 LTS. The PC is a 2004 HP ze4200 laptop: 1.8 GHz Celeron M with 1 GB RAM and an 80 GB drive. It was running fine on WinXP SP3 and I cleaned the drive off to test Lubuntu 14 LTS. No anti-virus is installed yet. I enabled the CPU resource monitor to see how various programs drag the OS. Using Firefox 31 online right now, I see that basic functions like opening a new tab and scrolling down a page use 100% CPU time, occasionally for 10-30 seconds. In fact some pretty basic apps like Leafnote hit 100% for a second. Wordpad never did that. Lubuntu Software Center locks things up at 100% for 10 seconds. Just typing here shows a 60-80% spike every character. Running the mouse around the screen for 10 seconds results in a sustained 100% load during that time. Right now, if I let the PC rest just idling Firefox and not doing anything with it, CPU use bounces from 20-40% all by itself. WinXP idles at 2-10% and it's considered not good for it to be above 20%... something odd must be happening. Sure, XP will give similarly higher CPU cycles with program use, but it's not locking and slogging like this. Lubuntu is supposed to be a light OS, and by memory usage it is, and I'm happy since this is an old PC maxed for memory upgrades. However, being used to doing some tuning and wary of abnormalities going on in the background, the CPU use indicates things going on that I want to know about and perhaps apply a tweak or two. Recommendations are appreciated. And this 300 point "new tags" restriction bites!

    Read the article

  • Announcing Release of Oracle Solaris Cluster 4.1!

    - by user9159196
    Oct 26, 2012 - We are very happy to announce the release of Oracle Solaris Cluster 4.1, providing High Availability (HA) and Disaster Recovery (DR) capabilities for Oracle Solaris 11.1. This is yet more proof of Oracle's continued investment in Oracle Solaris technologies such as Oracle Solaris Cluster. For this new release we have improved the Solaris Cluster integration within the Oracle environment. For example, we've created new agents such as PeopleSoft JobScheduler and added support for Oracle ZFS Storage Appliance replication in the Geo Edition module (to facilitate disaster recovery in multi-site configurations equipped with that type of storage). We have also extended the Oracle Solaris Zone Cluster feature with support for Oracle Solaris 10 zone clusters and exclusive-IP to facilitate deployment of virtualized or cloud architectures. And there are many more new features to discover in this release. Stay tuned for more specific articles. In the meantime, check out the What's new document or, even better, download the latest version from here. Also, join the Oracle Solaris 11 Online Event on November 7, where an entire session will be devoted to discussing Oracle Solaris Cluster 4.1. Our Oracle Solaris Cluster engineers will be on hand to respond to your questions. We look forward to your feedback and input! -Nancy Chow and Eve Kleinknecht

    Read the article

  • Operation MVC

    - by Ken Lovely, MCSE, MCDBA, MCTS
    It was time to create a new site. I figured VS 2010 is out so I should write it using MVC and Entity Framework. I have been very happy with MVC. My boss has had me making an administration web site in MVC2 but using 2008. I think one of the greatest features of MVC is you get to work with the root of the app. It is kind of like being an iron worker; you get to work with the metal, mold it from scratch. Getting my articles out of my database and onto web pages was far easier with MVC than it was with regular ASP.NET. This code is what I use to post the article to that page. It's pretty straightforward. The link in the menu passes the id, which is simply the URL of the page. It looks for that URL in the database and returns the rest of the article.
        DataResults dr = new DataResults();
        string title = string.Empty;
        string article = string.Empty;
        foreach (var D in dr.ReturnArticle(ViewData["PageName"].ToString()))
        {
            title = D.Title;
            article = D.Article;
        }

        public List<CurrentArticle> ReturnArticle(string id)
        {
            var resultlist = new List<CurrentArticle>();
            DBDataContext context = new DBDataContext();
            var results = from D in context.MyContents
                          where D.MVCURL.Contains(id)
                          select D;
            foreach (var result in results)
            {
                CurrentArticle ca = new CurrentArticle();
                ca.Title = result.Title;
                ca.Article = result.Article;
                ca.Summary = result.Summary;
                resultlist.Add(ca);
            }
            return resultlist;
        }

    Read the article

  • How to evaluate the quality of Rails code?

    - by Fortuity
    In a code review, what do you look for to assess a developer's expertise? Given an opportunity to look at a developer's work on a real-world project, what tell-tale signs are a tip-off to carelessness or lack of experience? Conversely, where do you look in the code to find evidence of a developer's skill or knowledge of best practices? For example, if I'm looking at a typical Rails app, I would be happy to see the developer is using RSpec (showing a commitment to using test-driven development and knowledge that RSpec is currently more popular than the default TestUnit). But in examining the specs for a Rails model, I see that the developer is testing associations, which might indicate a lack of real understanding of Rails testing requirements (since such tests are redundant given that they only test what's already implemented and tested in ActiveRecord). More generally, I might look to see if developers are writing their own implementations versus using widely available gems or if they are cleaning up code versus leaving lots of commented-out "leftovers." What helps you determine the skill of a Rails developer? What's your code quality checklist?

    Read the article

  • Coders For Charities

    - by Robz / Fervent Coder
    Last weekend I had the opportunity to give back to the community doing what I love. As geeks we don't usually have this opportunity. The event is called Coders 4 Charities (C4C) and it's a grueling weekend of nearly 30 hours of coding. When you finish you get to present to the charity and all of the other groups what you have completed. From the site: Coders For Charities is a 3-day charity event that pairs charities and local software developers. Charities often do not have the funds to implement a new website or intranet or database solution. Software developers often do not volunteer for charities because their skills do not apply. This event is the perfect marriage of these two needs; software developers volunteering their time to help charities better serve their community through the latest technology! The actual event brought together multiple charities and about 50 developers, designers, business analysts, etc., each working with a different charity to come up with a solution that they could implement in less than 3 days. C4C provided a place and food for us so that we wouldn't have to leave much during the time we had to implement our solution. They also provided games like Rock Band so we could get away and clear our minds for a few moments if necessary. I don't think we made it down there to play, but the food and drinks were a huge help for us. The charity we picked was Harvest Home. They had a need for an online intranet site where they could track membership and gardening. Over the next few days we worked on a site we could give them. Below is a screen shot with private data marked out. It was an awesome and humbling experience to be able to give back to a charity and I'm happy I was a part of it. I would definitely do it again. How often do we get to use our abilities to volunteer our time to a charity?

    Read the article

  • Webcast - Oracle Database In-Memory Option

    - by Thanos Terentes Printzios
    Following the recent announcement by Larry Ellison on the Future of the Database, we are happy to share this exclusive series of live webcasts from Oracle Database Product Management, where you can learn more about the brand new Oracle Database 12c In-Memory option. Oracle Database In-Memory is Oracle’s new memory-optimized technology that transparently accelerates analytic, data warehousing, and reporting workloads, while also accelerating transaction processing (OLTP) workloads. Participants will learn about Oracle Database In-Memory benefits, features, and leading-edge architecture. The Database In-Memory architecture provides the ability to easily process data orders of magnitude faster by simply enabling the feature and identifying tables to bring in-memory, without application changes. Details on Oracle Database In-Memory’s ease of use and management, scalability, and availability will also be covered. Please join us to learn more about Oracle Database In-Memory and get first-hand knowledge of this important new feature.
    Delivery Format: This FREE online LIVE eSeminar will be delivered over the Web. These Oracle webcasts are FREE for Customers, System Integrators, ISVs, VARs and Platform Partners.
    Presenter: Richard Jacobs, Oracle Solution Architect
    Europe Webcast 1 Date: August 29, 2014 @ 10:00 am to 11:00 am Central European Summer Time (CEST) - Register Here!
    Europe Webcast 2 Date: September 29, 2014 @ 10:00 am to 11:00 am Central European Summer Time (CEST) - Register Here!

    Read the article

  • Microsoft’s new technical computing initiative

    - by Randy Walker
    I made a mental note from earlier in the year.  Microsoft literally buys computers by the truckload.  From what I understand, it’s a typical practice amongst large software vendors.  You plug a few wires in, you test it, and you instantly have mega tera tera flops (don’t hold me to that number).  Microsoft has been trying to plug away at their cloud services (named Azure).  Which, for the layman, means Microsoft runs your software on their computers, and as demand increases you can allocate more computing power on the fly. With this in mind, it doesn’t surprise me that I was recently sent an executive email concerning Microsoft’s new technical computing initiative.  I find it to be a great marketing idea with actual substance behind their real work.  From the programmer academic perspective, in college we dreamed about this type of processing power.  This has decades of computer science theory behind it. A copy of the email received.  (note that I almost deleted this email, thinking it was spam due to it’s length) We don't often think about how complex life really is. Take the relatively simple task of commuting to and from work: it is, in fact, a complicated interplay of variables such as weather, train delays, accidents, traffic patterns, road construction, etc. You can however, take steps to shorten your commute - using a good, predictive understanding of a few of these variables. In fact, you probably are already taking these inputs and instinctively building a predictive model that you act on daily to get to your destination more quickly. Now, when we apply the same method to very complex tasks, this modeling approach becomes much more challenging. Recent world events clearly demonstrated our inability to process vast amounts of information and variables that would have helped to more accurately predict the behavior of global financial markets or the occurrence and impact of a volcano eruption in Iceland. To make sense of issues like these, researchers, engineers and analysts create computer models of the almost infinite number of possible interactions in complex systems. But, they need increasingly more sophisticated computer models to better understand how the world behaves and to make fact-based predictions about the future. And, to do this, it requires a tremendous amount of computing power to process and examine the massive data deluge from cameras, digital sensors and precision instruments of all kinds. This is the key to creating more accurate and realistic models that expose the hidden meaning of data, which gives us the kind of insight we need to solve a myriad of challenges. We have made great strides in our ability to build these kinds of computer models, and yet they are still too difficult, expensive and time consuming to manage. Today, even the most complicated data-rich simulations cannot fully capture all of the intricacies and dependencies of the systems they are trying to model. That is why, across the scientific and engineering world, it is so hard to say with any certainty when or where the next volcano will erupt and what flight patterns it might affect, or to more accurately predict something like a global flu pandemic. So far, we just cannot collect, correlate and compute enough data to create an accurate forecast of the real world. But this is about to change. Innovations in technology are transforming our ability to measure, monitor and model how the world behaves. 
The implication for scientific research is profound, and it will transform the way we tackle global challenges like health care and climate change. It will also have a huge impact on engineering and business, delivering breakthroughs that could lead to the creation of new products, new businesses and even new industries. Because you are a subscriber to executive e-mails from Microsoft, I want you to be the first to know about a new effort focused specifically on empowering millions of the world's smartest problem solvers. Today, I am happy to introduce Microsoft's Technical Computing initiative. Our goal is to unleash the power of pervasive, accurate, real-time modeling to help people and organizations achieve their objectives and realize their potential. We are bringing together some of the brightest minds in the technical computing community across industry, academia and science at www.modelingtheworld.com to discuss trends, challenges and shared opportunities. New advances provide the foundation for tools and applications that will make technical computing more affordable and accessible where mathematical and computational principles are applied to solve practical problems. One day soon, complicated tasks like building a sophisticated computer model that would typically take a team of advanced software programmers months to build and days to run, will be accomplished in a single afternoon by a scientist, engineer or analyst working at the PC on their desktop. And as technology continues to advance, these models will become more complete and accurate in the way they represent the world. This will speed our ability to test new ideas, improve processes and advance our understanding of systems. Our technical computing initiative reflects the best of Microsoft's heritage. Ever since Bill Gates articulated the then far-fetched vision of "a computer on every desktop" in the early 1980's, Microsoft has been at the forefront of expanding the power and reach of computing to benefit the world. As someone who worked closely with Bill for many years at Microsoft, I am happy to share with you that the passion behind that vision is fully alive at Microsoft and is carried out in the creation of our new Technical Computing group. Enabling more people to make better predictions We have seen the impact of making greater computing power more available firsthand through our investments in high performance computing (HPC) over the past five years. Scientists, engineers and analysts in organizations of all sizes and sectors are finding that using distributed computational power creates societal impact, fuels scientific breakthroughs and delivers competitive advantages. For example, we have seen remarkable results from some of our current customers: Malaria strikes 300,000 to 500,000 people around the world each year. To help in the effort to eradicate malaria worldwide, scientists at Intellectual Ventures use software that simulates how the disease spreads and would respond to prevention and control methods, such as vaccines and the use of bed nets. Technical computing allows researchers to model more detailed parameters for more accurate results and receive those results in less than an hour, rather than waiting a full day. Aerospace engineering firm, a.i. solutions, Inc., needed a more powerful computing platform to keep up with the increasingly complex computational needs of its customers: NASA, the Department of Defense and other government agencies planning space flights. 
To meet that need, it adopted technical computing. Now, a.i. solutions can produce detailed predictions and analysis of the flight dynamics of a given spacecraft, from optimal launch times and orbit determination to attitude control and navigation, up to eight times faster. This enables them to avoid mistakes in any areas that can cause a space mission to fail and potentially result in the loss of life and millions of dollars. Western & Southern Financial Group faced the challenge of running ever larger and more complex actuarial models as its number of policyholders and products grew and regulatory requirements changed. The company chose an actuarial solution that runs on technical computing technology. The solution is easy for the company's IT staff to manage and adjust to meet business needs. The new solution helps the company reduce modeling time by up to 99 percent - letting the team fine-tune its models for more accurate product pricing and financial projections. Our Technical Computing direction Collaborating closely with partners across industry and academia, we must now extend the reach of technical computing even further to help predictive modelers and data explorers make faster, more accurate predictions. As we build the Technical Computing initiative, we will invest in three core areas: Technical computing to the cloud: Microsoft will play a leading role in bringing technical computing power to scientists, engineers and analysts through the cloud. Existing high- performance computing users will benefit from the ability to augment their on-premises systems with cloud resources that enable 'just-in-time' processing. This platform will help ensure processing resources are available whenever they are needed-reliably, consistently and quickly. Simplify parallel development: Today, computers are shipping with more processing power than ever, including multiple cores, but most modern software only uses a small amount of the available processing power. Parallel programs are extremely difficult to write, test and trouble shoot. However, a consistent model for parallel programming can help more developers unlock the tremendous power in today's modern computers and enable a new generation of technical computing. We are delivering new tools to automate and simplify writing software through parallel processing from the desktop... to the cluster... to the cloud. Develop powerful new technical computing tools and applications: We know scientists, engineers and analysts are pushing common tools (i.e., spreadsheets and databases) to the limits with complex, data-intensive models. They need easy access to more computing power and simplified tools to increase the speed of their work. We are building a platform to do this. Our development efforts will yield new, easy-to-use tools and applications that automate data acquisition, modeling, simulation, visualization, workflow and collaboration. This will allow them to spend more time on their work and less time wrestling with complicated technology. Thinking bigger There is so much left to be discovered and so many questions yet to be answered in the fascinating world around us. We believe the technical computing community will show us that we have not seen anything yet. Imagine just some of the breakthroughs this community could make possible: Better predictions to help improve the understanding of pandemics, contagion and global health trends. 
Climate change models that predict environmental, economic and human impact, accessible in real-time during key discussions and debates. More accurate prediction of natural disasters and their impact to develop more effective emergency response plans. With an ambitious charter in hand, this new team is ready to build on our progress to-date and execute Microsoft's technical computing vision over the months and years ahead. We will steadily invest in the right technologies, tools and talent, and work to bring together the technical computing community. I invite you to visit www.modelingtheworld.com today. We welcome your ideas and feedback. I look forward to making this journey with you and others who want to answer the world's biggest questions, discover solutions to problems that seem impossible and uncover a host of new opportunities to change the world we live in for the better. Bob

    Read the article

  • no-www redirect not working / DNS A record

    - by HonzaB
    I'm far from an expert on this, but I'll try to be as clear as possible. I have an eshop solution leased from a company and it's hosted on their server. I can access it thru company.com/myshop and it also allows me to set up to 3 domains that they should recognize and redirect to my specific shop from. I registered a domain with a different company and am trying to "redirect" it to the eshop. By means of one of the following DNS entries (as they look in the admin GUI):
    *             A  111.111.111.111
    *.myshop.com  A  111.111.111.111
    myshop.com    A  111.111.111.111
    I've managed to make www.myshop.com redirect to the IP of company.com (111.111.111.111), which then goes on to do exactly what I expect it to do (ie. recognizes it comes from my domain, does some further redirects itself). However, I can't seem to make myshop.com (ie. without the www) redirect to that IP, too. The company that I registered the domain with provides a "URL redirect" service, but Google would only register the redirect request and wouldn't follow it. That's why I hope for a DNS solution to this - my assumption being I've managed to miss adding a record to the DNS; if, however, the reason lies elsewhere, I'd be happy to hear about that too. If it's a search engine friendly solution (ie the www/no-www dilemma - avoidance of double content penalties), then that's even better; I have no preference either way (www/no-www), just need it to work. Any help is greatly appreciated, thanks

    Read the article

  • Practical Approaches to Increasing Virtualization Density - Part 1

    - by Girish Venkat
    Happy New Year, everyone! Let me kick the year off by talking about virtualization density.
    What is it? The number of virtual servers that a physical server can support, and its increase from the prior physical infrastructure as a percentage.
    Why is it important? This is important because the density should be indicative of how well the server is getting consumed.
    So what is wrong? Virtualization density fails to convey the "real usage" of a server. Most of the hypervisor-based O/S virtualization evangelists take pride in the fact that they are now running a virtual server farm of X machines compared to a physical server farm of Y (with Y less than X, obviously). The real question is - has your utilization of the servers really increased or not? In an internal study conducted by one of the top financial institutions, the utilization of servers only went up by 15 points, from 30 to 45. So this really means that just by increasing virtualization density one will not be achieving the goal of using the servers in the server farm better. I will write about the possible approaches to increasing virtualization density in the next entry.

    Read the article

  • MySQL Enterprise Monitor 2.3.12 Is Now Available!

    - by Andy Bang
    We are pleased to announce that MySQL Enterprise Monitor 2.3.12 is now available for download on the My Oracle Support (MOS) web site. It will also be available via the Oracle Software Delivery Cloud in approximately 1-2 weeks. This is a maintenance release that contains several new features and fixes a number of bugs. You can find more information on the contents of this release in the changelog: http://dev.mysql.com/doc/mysql-monitor/2.3/en/mem-news-2-3-12.html You will find binaries for the new release on My Oracle Support: https://support.oracle.com Choose the "Patches & Updates" tab, and then use the "Product or Family (Advanced Search)" feature. And from the Oracle Software Delivery Cloud (in about 1-2 weeks): http://edelivery.oracle.com/ Choose "MySQL Database" as the Product Pack and you will find the Enterprise Monitor along with other MySQL products. If you haven't looked at 2.3 recently, please do so now and let us know what you think. Thanks and Happy Monitoring! - The MySQL Enterprise Tools Development Team

    Read the article

  • Ubuntu: The Movie

    - by CYREX
    Since Ubuntu is the most popular distribution and has driven a lot of change in many places around the globe and in different industries - to the point where even people who do not know what Linux is know what Ubuntu is (go figure?) - there might be a movie coming someday (like The Social Network for Facebook or Revolution OS for Linux/Red Hat), and I wanted to know how it all came to be from the actual players in the show. UBUNTU: The Movie. Since I have seen several of the primary characters of the movie here, this might be a good place to start on how it all came to be. Not in the traditional Wikipedia way or the Ubuntu help section, but in terms of what the actual developers have in mind about how it all went down to the point of having a huge number of users and an incredible level of sophistication in the forum, help sections, installers, etc. This is just to have the know-how before the actual movie makes it out some day in the future. As a fan of Ubuntu this is a MUST KNOW! ;) Hope I made some people happy and some others shy, hehe.

    Read the article

  • A New Year’s Celebration in June

    - by Kristin Rose
    Happy Oracle New Year, everyone! Last week marked the official start of FY13 and we could not be more pleased with all that lies ahead this quarter, and all that we accomplished in the last… especially our newly updated Oracle PartnerNetwork (OPN) Solutions Catalog. If you thought it was great before, just wait until you see it now. We are ringing in our New Year right by fully equipping partners with the necessary tools they need to have another successful year. The Solutions Catalog will help draw attention to your partner services and offerings, highlighting your expertise. The Solutions Catalog is a centralized, customer-friendly site that is easy to navigate. Some of the exciting advancements include:
    - A streamlined search interface
    - A robust lead capture tool that requests the contact information of potential customers
    - A professional display of customer recommendations to showcase your skill set
    - A partner dashboard with enhanced profile creation and an improved publication process
    Most exciting of all, updating your profile is easier than ever with the updated partner dashboard. Keeping your partner profile up to date will help to ensure customers are looking at the correct information about your company, and can easily stay on top of any new developments or Specializations you receive. So don’t cut yourself short - be sure to update your profile today if you haven’t already done so. For more information on the exciting upgrades available to you, visit the ‘Resources for Partners’ page or watch Takane Aizeki, Principal Portal Manager at Oracle, walk through the upgraded Solutions Catalog and the different ways to showcase your value as an Oracle solution provider.
    Cheers,
    Lydia Smyers
    Group Vice President
    WWA&C and Communications

    Read the article

  • How to improve battery life on Samsung 13.3” Series 7 Ultra (NP730U3E-S01AU)?

    - by beam022
    Recently I bought a Samsung Series 7 Ultra ultrabook and decided to change the OS from the originally installed Windows 8 to Ubuntu 14.04 LTS. However, it's difficult not to notice a great decrease in battery life: on pre-installed Windows 8 the battery would last for about 6 hours, while on Ubuntu it's almost empty after 2 hours of the same kind of work (wi-fi, web, vlc, spotify, intellij idea). I'm not here to say that Ubuntu's battery performance is worse than Windows, but to ask for suggestions on how to improve the situation (2 hours of work is pretty poor battery life). Can you recommend some sources, applications or tips/tricks that would improve battery life on my ultrabook? I really like the Ubuntu experience, but this makes my machine much less reliable. I suspect that the graphics card might be one of the issues here. Let me give you the tech specs of the ultrabook:
    Processor: Intel® Core™ i5 Processor 3337U (1.80GHz, 3MB L3 Cache)
    Chipset: Intel HM76
    Graphics: AMD Radeon™ HD 8570M Graphics with 1GB gDDR3 Graphic Memory (PowerExpress) and Intel(R) HD Graphics 4000
    Display: 13.3" SuperBright+ 350nit FHD LED Display (1920 x 1080), Anti-Reflective
    Memory: 10GB DDR3 System Memory at 1,600MHz
    Hard drive: 128GB Solid-state Drive
    More information is here, on the official page. If it's helpful to provide additional info, I'm happy to do it, just let me know what you need. Thank you.

    Read the article

  • Improving SpriteBatch performance for tiles

    - by Richard Rast
    I realize this is a variation on what has got to be a common question, but after reading several (good) answers I'm no closer to a solution here. So here's my situation: I'm making a 2D game which has (among some other things) a tiled world, and so drawing this world implies drawing a jillion tiles each frame (depending on resolution: it's roughly a 64x32 tile with some transparency). Now I want the user to be able to maximize the game (or fullscreen mode, actually, as it's a bit more efficient), and instead of scaling textures (bleagh) this will just allow lots and lots of tiles to be shown at once. Which is great! But it turns out this puts upward of 2000 tiles on the screen each time, and this is framerate-limiting (I've commented out enough other parts of the game to make sure this is the bottleneck). It gets worse if I use multiple source rectangles on the same texture (I use a tilesheet; I believe changing textures entirely makes things worse), or if you tint the tiles, or whatever. So, the general question is this: What are some general methods for improving the drawing of thousands of repetitive sprites? Answers pertaining to XNA's SpriteBatch would be helpful but I'm equally happy with general theory. Also, any tricks pertaining to this situation in particular (drawing a tiled world efficiently) are also welcome. I really do want to draw all of them, though, and I need SpriteSortMode.BackToFront to be active, because
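
    One standard approach, shown as a rough sketch below (my own illustration, not from the original question), is to cull to the camera's view so only on-screen tiles are drawn, keep everything on one tilesheet, and issue a single Begin/End pass. The map, camera and tile-size names here are assumptions for illustration:

        // Hedged sketch: draw only the tiles that intersect the camera view, from one
        // tilesheet, in a single SpriteBatch pass (XNA 4.0 style).
        using System;
        using Microsoft.Xna.Framework;
        using Microsoft.Xna.Framework.Graphics;

        public static class TileRenderer
        {
            public static void DrawVisibleTiles(SpriteBatch spriteBatch, Texture2D tileSheet,
                                                int[,] map, Rectangle cameraView,
                                                int tileWidth, int tileHeight)
            {
                // Clamp the loops to the tiles that can actually appear on screen.
                int firstCol = Math.Max(0, cameraView.Left / tileWidth);
                int firstRow = Math.Max(0, cameraView.Top / tileHeight);
                int lastCol = Math.Min(map.GetLength(0) - 1, cameraView.Right / tileWidth);
                int lastRow = Math.Min(map.GetLength(1) - 1, cameraView.Bottom / tileHeight);

                // BackToFront keeps the required sorting; one Begin/End with a single
                // texture lets SpriteBatch submit the tiles as one large batch.
                spriteBatch.Begin(SpriteSortMode.BackToFront, BlendState.AlphaBlend);
                for (int col = firstCol; col <= lastCol; col++)
                {
                    for (int row = firstRow; row <= lastRow; row++)
                    {
                        int tileIndex = map[col, row];
                        // Source rectangle into the tilesheet for this tile type.
                        Rectangle source = new Rectangle(tileIndex * tileWidth, 0, tileWidth, tileHeight);
                        Vector2 position = new Vector2(col * tileWidth - cameraView.Left,
                                                       row * tileHeight - cameraView.Top);
                        spriteBatch.Draw(tileSheet, position, source, Color.White,
                                         0f, Vector2.Zero, 1f, SpriteEffects.None, 1f);
                    }
                }
                spriteBatch.End();
            }
        }

    Even with BackToFront active, keeping the ground layer at a single depth and on a single texture avoids per-tile state changes, and view culling typically cuts the draw count from thousands of tiles to only those actually visible.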

    Read the article
