Search Results

Search found 28847 results on 1154 pages for 'project organization'.


  • Which paradigm to use for writing chess engine?

    - by poke
    If you were going to write a chess game engine, what programming paradigm would you use (OOP, procedural, etc.) and why would you choose it? By chess engine, I mean the portion of a program that evaluates the current board and decides the computer's next move. I'm asking because I thought it might be fun to write a chess engine. Then it occurred to me that I could use it as a project for learning functional programming. Then it occurred to me that some problems aren't well suited to the functional paradigm. Then it occurred to me that this might be good discussion fodder.
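    For a taste of what the functional version might look like, here is a minimal C# sketch (the Board/Piece shapes and the piece values are illustrative assumptions, not from any real engine), where evaluation is a pure function of an immutable position:

        using System.Collections.Generic;
        using System.Linq;

        enum Side { White, Black }
        record Piece(Side Side, char Kind);        // Kind: 'P','N','B','R','Q','K'
        record Board(IReadOnlyList<Piece> Pieces); // immutable position snapshot

        static class Eval
        {
            // Classic centipawn material values; this map is an assumption.
            static readonly Dictionary<char, int> Value = new()
                { ['P'] = 100, ['N'] = 320, ['B'] = 330, ['R'] = 500, ['Q'] = 900, ['K'] = 0 };

            // Pure function: positive scores favor White. No mutation anywhere,
            // which makes the evaluator easy to test, memoize, and parallelize.
            public static int Score(Board board) =>
                board.Pieces.Sum(p => (p.Side == Side.White ? 1 : -1) * Value[p.Kind]);
        }

    Because the evaluator is a pure function of the position, it is trivial to unit-test and safe to call from a parallel search - which is a large part of the functional paradigm's appeal for exactly this kind of problem.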

    Read the article

  • Deploying SSIS to Integration Services Catalog (SSISDB) via SQL Server Data Tools

    - by Kevin Shyr
    There are quite a few good articles/blogs on this.  For a straightforward deployment, read this (http://www.bibits.co/post/2012/08/23/SSIS-SQL-Server-2012-Project-Deployment.aspx).  For a more dynamic and comprehensive understanding of all the different settings, read part 1 (http://www.mssqltips.com/sqlservertip/2450/ssis-package-deployment-model-in-sql-server-2012-part-1-of-2/) and part 2 (http://www.mssqltips.com/sqlservertip/2451/ssis-package-deployment-model-in-sql-server-2012-part-2-of-2/). Microsoft's official doc: http://technet.microsoft.com/en-us/library/hh213373. The only thing I would add is the following: after your first deployment, you'll notice that the subsequent deployment skips the second step (it goes directly to "Select Destination", skipping "Select Source").  That's because after your initial deployment, an .ispac file is created to track deployment.  If you decide to go back to "Select Source" and select the SSIS catalog again, the deployment process will complete, but the packages will not be deployed.

    Read the article

  • Ring in the Holiday with Papercraft Star Wars Snowflakes

    - by Jason Fitzpatrick
    Whether your holiday decorating is begging for a geeky touch (or your nieces and nephews are begging for something to occupy their time while visiting this holiday), Anthony Herrera’s Star Wars themed paper snowflakes are a perfect geeky holiday project. This year’s collection includes Admiral Ackbar, A-Wings, B-Wings, Chewbacca, Ewoks, and more. Be sure to check out the 2011 and 2010 editions, for even more characters. Star Wars Snowflakes 2012 [Anthony Herrera Designs]

    Read the article

  • Security issue about making my code public in GitHub

    - by John Doe
    I'm developing a big community/forum website and I'd like to upload my code to GitHub to have at least some sort of version control over it (because I have nothing other than a .rar file as a backup, not even SVN), to let others contribute to the project, and also perhaps to let potential future employers see some of my code as a sort of curriculum vitae. But what I'm wondering now, and I'm surprised I haven't seen anyone mention it before, is the security aspect of it. Isn't publishing the code of a website a HUGE security hole? It's like giving a potential hacker, or anyone who wants to look, the means to find every possible exploit, even considering that the critical files aren't uploaded (database passwords, authentication scripts, etc.). Of course, there are millions of projects uploaded to GitHub and no one will find mine just by chance. But if they look for it, it would indeed be there. Bottom line: my problem is not about copyright or licenses, but about others finding exploits in my website. Am I missing something here?

    Read the article

  • Permissions on Mac for iTunes library with multiple users - idea

    - by John
    I currently have a lot of music on an external drive, with my iTunes library set up from there. However, periodically, when the external drive isn't connected, iTunes will default back to the library location in my home directory. I don't want to mess with an external drive, as my Mac's HD is large enough to house the music collection. However, I have 4 family members - all with their own logins - using this same gob of music. I don't want 4 copies of the library, just one that all the libraries reference. So, what I want to do is:
    1 - Move all music files to a shared directory at /Macintosh HD/users/music. I created this directory and adjusted permissions so all four users can read and write to it.
    2 - Get all four accounts to reference this library instead of the external or local home locations.
    I am hoping I can just check the box to keep the library organized in my account, which is the admin, and let iTunes move it all. Then delete the current libraries for each account and re-add from the new shared location. Will the iTunes organization process cause permissions issues, either by setting the files' permissions so only my account has access, or by breaking write permissions, or any other 'gotcha'? I am having a hard time coming up with a smooth solution that won't break everything and cause me to have mega duplicates or access issues. I would prefer not to do any XML library file editing if possible. Am I dreaming? Thanks for the help.

    Read the article

  • Can I configure a visual difference view with the notifications provided by TFS?

    - by John Kaster
    I have TFS sending me alerts whenever someone on my team checks in code. (I had to create notification rules for every project, but that's just a sidebar complaint in this question.) These alerts provide some information on who checked in which files and when, with URLs to view details in a browser. The thing that baffles me is that I can't just click on the source file and see a visual diff of the changes. There's no link that will auto-launch a diff in Visual Studio (using a custom protocol) from there either. Is there a way to configure TFS to provide a visual diff of the changes to the file that was checked in via this notification UI?

    Read the article

  • Is the term "web portal" obselete?

    - by John Hamelink
    Firstly, sorry if this is the wrong place: I've looked at all the programming-related boards and this one seems like the best fit - correct me if I'm wrong. My boss uses the term "portal" for the project I work on all the time. To me, the word makes me think of Yahoo in the late 90s. Does the word "portal" have old-school connotations, or is it just me? Do you think it's OK to use it without dragging our client's perception of the product down into the Middle Ages? Or again, am I just being weird?

    Read the article

  • VoIP setup for one external PSTN line

    - by Jcl
    I'm completely new to VoIP and the like, and I'm trying to find information about what the best setup for this could be. I need 4 wireless extensions (maybe more in the future, but 5 or 6 at most), connected to 1 PSTN line, and maybe 2 in the future. I've been trying to gather information about the gear needed, but everything I find seems over-the-top (and extremely expensive). The main problem is that our physical location has no possibility of a decent internet connection, so using an external VoIP "virtual PBX" is not an option. Thing is, even though this organization is small, phone service is critical to it. I currently have an analog DECT/GAP PBX which does what I need, but the PBX is very bad and the call quality is horrible, and that's why I want to change it. The requirements would be:
    - 4 wireless terminals (routing cable is not an option), all of them ringing on incoming PSTN calls.
    - The ability to make internal calls (4 separate offices) and to pass calls between terminals.
    - The 4 terminals should be able to access the external PSTN line without dialing any special codes.
    - Very important: terminals should be able to issue commands on the PSTN line to the external operator in the form *nn*nnnnnnnn#. I don't know whether this could turn out to be a problem, but I've had problems with analog PBXs that would treat any * as a PBX command and wouldn't allow terminals to send it to the external lines.
    - Not so important, but nice to have: call-waiting music.
    Could anyone recommend such a setup? I need to be able to do this on an EXTREMELY LIMITED budget (that is, I don't have a hard limit, but the total should get as close to zero as possible). I have enough spare powerful computers and a 300 Mbps wireless network which works just fine, so that's not to be included in the budget. I don't really know if this is the best place to ask, but it's the most StackExchange-related site I've found for this subject.

    Read the article

  • Deploying workstations - best practices?

    - by V. Romanov
    Hi guys, I've been researching the subject of workstation deployment for a while, and found a ton of info and dozens of different methods and tools, but no "best practice" method that doesn't lack at least one feature I consider required for the solution to be perfect. I'm currently interested in Windows workstation deployment, but if the tools can be extended to Linux, that's added value. I want the deployment tools I use to be able to do the following:
    - Hardware independent - I want my image or installation to have a minimum of hardware and driver dependency, so that I can use a single image/package for all workstations.
    - Easily updatable - I want to be able to update my image as easily as possible without redeploying/rebuilding/reimaging all configurations.
    - PXE bootable deployment - I want the tools to be bootable off the network, so that I don't need a boot CD/DOK.
    - Scriptable for minimum human input - Ideally, the tool should run automatically after being booted and perform a "default" deployment (including partitioning) unless prompted otherwise. I.e., take a PC, hook it up, power on, PXE boot and forget about it until the OS is deployed.
    I found no single product or environment that does all this. The closest I came is Windows Deployment Services and the WIM image format. I also checked out numerous imaging and deployment tools including Clonezilla, Ghost, g4u, WPKG and others, but most of them lack the hardware independence and updatability features. We currently have a Symantec Ghost server set up that does imaging over the network, but I'm not satisfied with it, as it has all the drawbacks listed above. Do you have suggestions for optimizing the process of workstation deployment? How do you deploy workstations in your organization? Thanks! Vadim.

    Read the article

  • Importing csv list of contacts into Exchange 2007 GAL and create Distribution Group

    - by Ken Ray
    Here's the situation: we have a list of about 1,000 contacts (lawyers in the area our court serves) with name and email address. I've been asked to create an email distribution list that can be used to send emails to all of the external users on that list. I've seen various articles using the Exchange Management Shell and the Import-Csv command piped through ForEach-Object to New-MailContact to set up the contacts. However, Exchange Management Shell is rather unhelpful, and it isn't working. What I believe I need to do is:
    1) Set up a new distribution group using the Exchange Management Console. Let's say this new distribution group (which appears in the list of distribution groups under Recipient Configuration) is called "FloridaBar".
    2) Make sure I have a csv file of the information I want to import.
    3) Open Exchange Management Shell, and enter the following command:
    Import-Csv C:\filename.csv | ForEach-Object { New-MailContact -Name $_."NameColumnName" -ExternalEmailAddress $_."EmailAddressColumn" -org FloridaBar }
    Now, creating 1,000+ contacts in Active Directory - I assume that shouldn't be an issue. Do I have the "-org" parm wrong? Do I need to spell out the complete organizational unit name (my.domain.name/Users/FloridaBar)? Is there a better way of doing this? Thanks in advance, Ken

    Read the article

  • Android Cocos2DX using C++ in Eclipse Helios Windows XP

    - by 25061987
    I have used Eclipse Helios 3.6.1 for Java development. I wanted to start C++ development in the same IDE, so I installed the Autotools Support for CDT, C/C++ Development Tools, and C/C++ Library API Documentation Hover Help plugins. I have included #include "cocos2d.h" in my HelloWorldScene.h file. Now, when writing the statement below: cocos2d::CCSprite * ccSprite; I am not getting the auto-completion bar (template proposals) on typing "coco" and pressing Ctrl + Space on my keyboard. What can be the problem? This might help you solve my problem: please check here. This is what I got after clicking Right Click Project - Index - Search for Unresolved Index. But I have added all includes, check here. I think this is causing a problem in Content Assist. What should I do in this case? Inclusion seems proper.

    Read the article

  • Motion Sensing Fog Machine Increases Savings and Spook Factor

    - by Jason Fitzpatrick
    This DIY add-on switches a standard fog machine from always-on to motion-activated – increasing your savings and spook factor at the same time. Courtesy of tinkerer Greg, this modification involves a new relay and motion sensor mounted onto the existing switch of a store-bought fog machine. When the motion sensor detects motion, the fog machine releases a burst of fog for 5 seconds and then disarms itself for 10 seconds – long enough for the startled victim to move on and for the machine to recharge for the next passerby. Check out the video above to see it in action and then hit up the link below to see the project’s build guide. Motion Sensing Fog Machine Trigger [via Hack A Day]

    Read the article

  • copying folder and file permissions from one user to another after switching domains [closed]

    - by emptyspaces
    Please excuse the title; this was the best way I could think to describe this scenario without an entire paragraph. I am using C#. Currently I have a file server running Windows Server 2003, set up on a domain we will call oldDomain, and I have about 500 user accounts with various permissions on this server. Because of restrictions out of my control, we are abandoning this domain and using another one that is more dominant within the organization; we will call this newDomain. All of the users that have accounts on oldDomain also have accounts on newDomain, but the usernames are completely different and there is no link between the two. What I am hoping to do is generate a list of all user accounts and their appropriate SIDs from AD on oldDomain; I already have this part done using dsquery and dsget. Then I will have someone go through and match all of the accounts from oldDomain to the correct username on newDomain, ultimately leaving me with a list of SIDs from oldDomain and the appropriate username from newDomain. Now I am hoping to copy the file and folder permissions from the old user on oldDomain to the new user on newDomain once I join the server to newDomain. Can anyone tell me the best way to copy permissions from the SID to the user on newDomain? There are a bunch of articles out there about copying permissions from user A to user B, but I wanted to check and see what the recommended practice is here, since there are a ton of directories.
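    A minimal C# sketch of that copy step, assuming the .NET Framework's System.Security.AccessControl API on the file server and a SID-to-new-account mapping like the one described (all names here are illustrative):

        using System.Collections.Generic;
        using System.IO;
        using System.Security.AccessControl;
        using System.Security.Principal;

        static class AclCopier
        {
            public static void MirrorRules(string directory,
                IDictionary<string, string> oldSidToNewAccount)
            {
                var info = new DirectoryInfo(directory);
                DirectorySecurity security = info.GetAccessControl();

                // Read the explicit (non-inherited) rules, keyed by SID string.
                foreach (FileSystemAccessRule rule in
                         security.GetAccessRules(true, false, typeof(SecurityIdentifier)))
                {
                    if (!oldSidToNewAccount.TryGetValue(rule.IdentityReference.Value,
                                                        out var newUser))
                        continue;

                    // Re-create the same rights for the matching new-domain account.
                    security.AddAccessRule(new FileSystemAccessRule(
                        new NTAccount(newUser), rule.FileSystemRights,
                        rule.InheritanceFlags, rule.PropagationFlags,
                        rule.AccessControlType));
                }
                info.SetAccessControl(security);
            }
        }

    You would walk the directory tree and call this once per folder; whether to remove the old SID entries afterwards is a separate decision.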

    Read the article

  • IIS6 intranet site using integrated authentication fails to load when accessed externally

    - by maik
    I've developed a couple of internal sites for my organization that use integrated authentication. Ultimately we want these sites to be accessible externally to users with domain-joined computers. The sites work as expected on domain computers while on the internal network. The problem comes when I take my laptop home and try to access those sites. IIS only has integrated authentication enabled for the two sites. When I browse to a site using IE8, I get a username/password prompt asking for domain credentials. I can put those in and it will work, but the goal is to use the cached token for integrated authentication. Next I reasoned that IE wouldn't respond to an integrated auth request (is NTLM the right term for this?) unless the site was trusted. I tried adding the site to Trusted Sites, but I get the same behavior as before. I then added the site to Local Intranet sites, and that is where things get weird. I get a generic error page from IE - no error code or anything. Just for funsies, I loaded up Firefox (which I had previously set up to use integrated authentication) and added this new site to network.automatic-ntlm-auth.trusted-uris. Much to my surprise, I was able to load the pages with no problem at all and saw exactly what I was expecting (including verification that the integrated authentication worked). My mind is a bit boggled at the moment, as I'm not really sure where to go from here. I was hoping some of you might be able to provide some insight.

    Read the article

  • New DevExpress Web.Config Settings

    Starting with DXperience v2010.1, we're making a small and useful change: we're adding a new section to the web.config file for settings used by DevExpress ASP.NET controls. New Section: here's the default section that you'll find at the bottom of a new web project using the DXperience v2010.1 release:
    <devExpress>
      <compression enableHtmlCompression="false"
                   enableCallbackCompression="true"
                   enableResourceCompression="true" ...

    Read the article

  • [JOGL] My program is too slow; how can I profile it with Eclipse?

    - by nkint
    My simple OpenGL program is really too slow and not fluid. I'm rendering 30 spheres with simple illumination and simple materials. The only complex computation I do is collision detection between the mouse ray and the spheres (that works OK, and I do it only in mouseMoved). I'm not using any threads, just an animator to move the spheres. How can I profile my JOGL project? Or maybe (most probably...) I have some OpenGL instructions that I don't understand and that make rendering particularly heavy (or back-face rendering that I don't need, or whatever - I don't know exactly; I'm just entering the OpenGL world).

    Read the article

  • Embedding the Silverlight version of the Open Media Player

    I'm working on a Video Portal application and have selected the Open Video Player for embedded viewing of videos. There are many video players out there, but I selected this one because there are Silverlight and Flash versions in the project. Embedding is EASY! Code Snippet:
    <%@ Page Title="Home Page" Language="C#" MasterPageFile="~/Site.master" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="OpenPlayerSample._Default"...

    Read the article

  • The Lost Episode of Cosmos: The Meat Planet

    - by Jason Fitzpatrick
    In the 1980s Carl Sagan captivated TV viewers with his exploration of the universe; we present to you, a lost episode, The Meat Planet. Creators of the parody video, Darren Cullen and Mark Tolson, engaged in some expert splicing and dicing of past Cosmos episodes to create their masterpiece: the lost episode focused on the fabled Meat Planet. Watch the episode above or hit up the link below for more information about the project. Meat Planet [via Boing Boing]

    Read the article

  • Visual Studio Load Testing using Windows Azure

    - by Tarun Arora
    In my opinion, the biggest adoption barrier in performance testing on smaller projects is not the tooling but the high infrastructure and administration cost that comes with this phase of testing. If a reusable solution were possible and infrastructure management weren't as expensive, adoption would certainly spike. It certainly is possible if you bring Visual Studio and Windows Azure into the equation. It is possible to run your test rig in the cloud without getting tangled in SCVMM or Lab Management. All you need is an active Azure subscription, a Windows Azure endpoint-enabled developer workstation running Visual Studio Ultimate on premise, and Windows Azure endpoint-enabled worker roles on Azure compute instances set up to run as test controllers and test agents. My test rig is running SQL Server 2012 and Visual Studio 2012 RC agents. The beauty is that the solution is reusable: you can open the Azure project, change the subscription and certificate, click publish and *BOOM* - in less than 15 minutes you could have your own test rig running in the cloud. In this blog post I intend to show you how you can use the power of Windows Azure to effectively abstract away the administration cost of infrastructure management and lower the total cost of load and performance testing. As a bonus, I will share a reusable solution that you can use to automate test rig creation for both VS 2010 agents and VS 2012 agents.

    Introduction
    The slide show below should help you understand the high-level details of what we are trying to achieve... Leveraging Azure for Performance Testing (view more PowerPoint from Avanade).

    Scenario 1 – Running a Test Rig in Windows Azure
    To start off with the basics, in the first scenario I plan to discuss how to:
    - Automate deployment & configuration of Windows Azure worker roles for the test controller and test agent
    - Automate deployment & configuration of the SQL database on the test controller worker role
    - Scale test agents on demand
    - Create a web performance test and a simple load test
    - Manage test controllers right from Visual Studio on the on-premise developer workstation
    - View results of the load test
    - Clean up
    - Have the above work in the shape of a reusable solution for both a VS2010 and a VS2012 test rig

    Scenario 2 – The Scaled-Out Test Rig and Sharing Data Using SQL Azure
    A scaled-out version of this implementation would involve multiple test rigs running in the cloud; in this scenario I will show you how to sync the load test databases from these distributed test rigs into one SQL Azure database using Azure Sync. The selling point for this scenario is being able to collate the load test efforts from across the organization into one data store.
    - Deploy multiple test rigs using the reusable solution from scenario 1
    - Set up and configure Windows Azure Sync
    - Test the SQL Azure load test result database created as a result of Windows Azure Sync
    - Clean up
    - Have the above work in the shape of a reusable solution for both a VS2010 and a VS2012 test rig

    The Ingredients
    Though with an active MSDN Ultimate subscription you would already have access to everything and more, you will essentially need the below to try out the scenarios:
    1. Windows Azure subscription
    2. Windows Azure Storage - blob storage
    3. Windows Azure Compute - worker role
    4. SQL Azure database
    5. SQL Data Sync
    6. Windows Azure Connect - endpoints
    7. SQL 2012 Express or SQL 2008 R2 Express
    8. Visual Studio All Agents 2012 or Visual Studio All Agents 2010
    9. A developer workstation set up with Visual Studio 2012 Ultimate or Visual Studio 2010 Ultimate
    10. Visual Studio Load Test Unlimited Virtual User Pack

    Walkthrough
    To set up the test rig in the cloud, the test controller, test agent and SQL Express installers need to be available when the worker role setup starts; the easiest and most efficient way is to pre-upload the required software into Windows Azure blob storage (a sketch of this upload step appears at the end of this entry). SQL Express, the test controller and the test agent expose various switches which we can take advantage of, including the quiet-install switch. Once all three have been installed, the test controller needs to be registered with the test agents, and the SQL database needs to be associated with the test controller. By enabling Windows Azure Connect on the machines in the cloud and on the on-premise developer workstation, we successfully create a virtual network amongst the machines, enabling two-way communication. All of the above can be done programmatically; let's see step by step how...

    Scenario 1: Video Walkthrough - Leveraging Windows Azure for Performance Testing
    Scenario 2: Work in progress, watch this space for more...

    Solution
    If you are still reading and are interested in the solution, drop me an email with your Windows Live ID. I'll add you to my TFS Preview project, which has a reusable solution for both VS 2010 and VS 2012 test rigs as well as guidance and demo performance tests.

    Conclusion
    Other posts and resources are available here. Possibilities... endless!
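    A sketch of the pre-upload step referenced in the walkthrough above - this assumes the current Azure.Storage.Blobs package rather than the StorageClient library of the era, and the paths and names are placeholders:

        using System.IO;
        using Azure.Storage.Blobs;

        class UploadInstallers
        {
            static void Main()
            {
                // Placeholder connection string and container name.
                var container = new BlobContainerClient(
                    "<storage-connection-string>", "testrig-installers");
                container.CreateIfNotExists();

                // Hypothetical local paths to the SQL Express, controller and agent installers.
                foreach (var path in new[] { @"C:\kits\SQLEXPR_x64_ENU.exe",
                                             @"C:\kits\testcontroller.exe",
                                             @"C:\kits\testagent.exe" })
                {
                    var blob = container.GetBlobClient(Path.GetFileName(path));
                    using var stream = File.OpenRead(path);
                    blob.Upload(stream, overwrite: true); // safe to re-run
                }
            }
        }

    The worker role's startup task can then pull these blobs down and run each installer with its quiet-install switches.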

    Read the article

  • MD5 vertex skinning problem extending to multi-jointed skeleton (GPU Skinning)

    - by Soapy
    Currently I'm trying to implement GPU skinning in my project. So far I have achieved single-joint translation and rotation, and multi-jointed translation. The problem arises when I try to rotate a multi-jointed skeleton. The image above shows the current progress. The left image shows how the model should deform. The middle image shows how it deforms in my project. The right shows a better deform (still not right), made by inverting a certain value, which I will explain below. The way I get my animation data is by exporting it to the MD5 format (MD5mesh for mesh data and MD5anim for animation data). When I come to parse the animation data, for each frame I check if the bone has a parent; if not, the data is passed in as-is from the MD5anim file. If it does have a parent, I transform the bone's position by the parent's orientation, and then add this to the parent's translation. Then the parent and child orientations get concatenated. This is covered at this website.

    if (Parent < 0) {
        ... // Save this data without editing it
    } else {
        Math3::vec3 rpos;
        Math3::quat pq = Parent.Quaternion;
        Math3::quat pqi(pq);
        pqi.InvertUnitQuat();
        pqi.Normalise();
        Math3::quat::RotateVector3(rpos, pq, jv);
        Math3::vec3 npos(rpos + Parent.Pos);
        this->Translation = npos;
        Math3::quat nq = pq * jq;
        nq.Normalise();
        this->Quaternion = nq;
    }

    And to achieve the image to the right, all I need to do is change Math3::quat::RotateVector3(rpos, pq, jv); to Math3::quat::RotateVector3(rpos, pqi, jv); - why is that?

    And this is my skinning shader, SkinningShader.vert:

    #version 330 core
    smooth out vec2 vVaryingTexCoords;
    smooth out vec3 vVaryingNormals;
    smooth out vec4 vWeightColor;
    uniform mat4 MV;
    uniform mat4 MVP;
    uniform mat4 Pallete[55];
    uniform mat4 invBindPose[55];
    layout(location = 0) in vec3 vPos;
    layout(location = 1) in vec2 vTexCoords;
    layout(location = 2) in vec3 vNormals;
    layout(location = 3) in int vSkeleton[4];
    layout(location = 4) in vec3 vWeight;

    void main()
    {
        vec4 wpos = vec4(vPos, 1.0);
        vec4 norm = vec4(vNormals, 0.0);
        vec4 weight = vec4(vWeight, (1.0f - (vWeight[0] + vWeight[1] + vWeight[2])));
        normalize(weight);
        mat4 BoneTransform;
        for(int i = 0; i < 4; i++)
        {
            if(vSkeleton[i] != -1)
            {
                if(i == 0)
                {
                    // These are interchangeable for some reason
                    // BoneTransform = ((invBindPose[vSkeleton[i]] * Pallete[vSkeleton[i]]) * weight[i]);
                    BoneTransform = ((Pallete[vSkeleton[i]] * invBindPose[vSkeleton[i]]) * weight[i]);
                }
                else
                {
                    // These are interchangeable for some reason
                    // BoneTransform += ((invBindPose[vSkeleton[i]] * Pallete[vSkeleton[i]]) * weight[i]);
                    BoneTransform += ((Pallete[vSkeleton[i]] * invBindPose[vSkeleton[i]]) * weight[i]);
                }
            }
        }
        wpos = BoneTransform * wpos;
        vWeightColor = weight;
        vVaryingTexCoords = vTexCoords;
        vVaryingNormals = normalize(vec3(vec4(vNormals, 0.0) * MV));
        gl_Position = wpos * MVP;
    }

    The Pallete matrices are the matrices calculated using the above code (a rotation and translation matrix get created from the translation and quaternion). The invBindPose matrices are simply the inverted matrices created from the joints in the MD5mesh file.

    Update 1: I looked at GLM to compare the values I get with my own implementation. They turn out to be exactly the same. So now I'm checking if there's a problem with matrix creation...

    Update 2: Looked at GLM again to compare matrix creation using quaternions. Turns out that's not the problem either.
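    The parent-child composition described in the question can be restated compactly; here is a sketch using System.Numerics instead of the custom Math3 library (an assumption), with the caveat that quaternion multiplication order conventions differ between libraries - often exactly where a stray inverse creeps in:

        using System.Numerics;

        static class Md5Joint
        {
            // World transform of a child joint from its parent's world transform,
            // per the MD5 convention described above: rotate the child's local
            // position by the parent's orientation, add the parent's position,
            // and concatenate the orientations.
            public static (Vector3 Pos, Quaternion Rot) Compose(
                Vector3 parentPos, Quaternion parentRot,
                Vector3 localPos, Quaternion localRot)
            {
                Vector3 pos = parentPos + Vector3.Transform(localPos, parentRot);
                Quaternion rot = Quaternion.Normalize(parentRot * localRot);
                return (pos, rot);
            }
        }

    Comparing a known-good reference like this against Math3's operator conventions (and against the row-major versus column-major order implied by wpos * MVP in the shader) is one way to isolate where the unexpected inverse comes from.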

    Read the article

  • How do early version numbers work for new products?

    - by Lord Torgamus
    I'm currently writing a small desktop application for a friend, but I'm doing it primarily as a learning experience for myself. In the spirit of getting educated and doing things The Right Way, I want to have version numbers for this app. My research brought up these related results:
    - What "version naming convention" do you use?
    - How do you version your files (Version Numbers)
    - Forked a project, where do my version numbers start?
    but none of them address numbering of alphas, betas, release candidates, &c. What are the conventions for version numbers below 1.0? I know they can go on for some time; for example, PuTTY has been around for at least a decade and is still only at version beta 0.60.

    Read the article

  • Invoke WCF REST service from Razor MVC 4

    - by Raj Esh
    I have been using jQuery to access my REST-based WCF service, which does not export metadata. Using AJAX, I could populate data into controls. I need guidance and directions as to how I can use these REST services in my controller. I can't add a service reference to my MVC 4 project, since my WCF REST service does not expose metadata. Should I use Unity or any other DI framework? Any sample would be of great help.
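    Since the service exposes no metadata, one option is to call it as plain HTTP from the controller. Below is a minimal sketch assuming .NET 4.5's HttpClient and Json.NET for deserialization; the endpoint URL and the CustomerDto type are hypothetical:

        using System.Collections.Generic;
        using System.Net.Http;
        using System.Threading.Tasks;
        using System.Web.Mvc;
        using Newtonsoft.Json;

        public class CustomersController : Controller
        {
            // One shared client avoids socket exhaustion under load.
            private static readonly HttpClient Http = new HttpClient();

            public async Task<ActionResult> Index()
            {
                // Hypothetical REST endpoint exposed by the WCF service.
                string json = await Http.GetStringAsync(
                    "http://example.com/Service.svc/customers");
                var customers = JsonConvert.DeserializeObject<List<CustomerDto>>(json);
                return View(customers);
            }
        }

        public class CustomerDto
        {
            public int Id { get; set; }
            public string Name { get; set; }
        }

    A DI container such as Unity isn't required just to consume the service; it earns its keep if you wrap the HTTP calls behind an interface and inject that into your controllers for testing.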

    Read the article

  • In the Mobile and Tablet World, How Much is Too Much?

    - by andrewbrust
    The week of April 26th was a huge one in the world of mobile and tablet devices. There were so many individual developments, announcements and solidifications of strategy that it's almost impossible to believe they occurred in the same month, let alone the same week. Things started with Apple and Gizmodo having a Law and Order moment over the latter's procurement of what appears to be the former's 4th gen iPhone prototype. We found out on the 26th that Gizmodo blogger Jason Chen's apartment was raided by police and, honestly, that was a bit much. But Apple didn't stop there. They also published Steve Jobs' critique of Adobe Flash and his explanation of Cupertino's embargo of Flash on iPhones, iPods and iPads. If you ask me, this too, was a bit much. Apple finished up the week by releasing the 3G version of its iPad product to the US market. I like (iLike?) my WiFi iPad. The idea of getting a version of it that required a second 3G service monthly subscription is, well, a bit much. Microsoft was in the news too. It killed a project it hadn't even acknowledged the existence of: the Courier tablet. That's a bit much too. If a tree falls in the woods, and Microsoft says they can't hear it anyway, could they really have chopped it down? Maybe Microsoft Research should have licensed some of Courier's technology from other parts of Microsoft. Then maybe they could have kept the product alive. Ask HTC: they're going to be licensing technology from Microsoft because Redmond insists that Google's Android operating system infringes on certain of their patents. And since HTC now builds a number of handsets on Android, instead of being beholden, as they once were, to Windows Mobile, that means they can keep making their products. Why does HTC have to pay the royalties, and not Google? Maybe Microsoft decided that going after GOOG would have been a bit much, even for them. The agreement came not a moment too soon: HTC released their "Droid Incredible" (that name's a bit much), an Android 2.1 handset with amazing hardware and HTC's own Sense UI, on April 30th (this past Friday). This phone is very well-reviewed. Maybe that's why Google basically decided to beg off introducing a version of its Nexus One phone (also manufactured by HTC) on the Verizon Wireless network. Google backing down? That's incredible, if not also a bit much. And that brings us to HP, which this week announced its acquisition of Palm and its webOS mobile phone touch-oriented operating system. HP also killed its own Slate initiative. Apparently HP realized that Windows 7, even with a proprietary HP touch UI added on top, is no match for the iPad. I'm guessing they think webOS might work a bit better. And I'm wondering if HP even wants to use webOS for phone handsets, beyond the Pre and Pixi. Using it just for slate devices would be a bit extreme, but maybe not too much. Honestly, this was not Microsoft's best week. It killed a project and a close partner did likewise. Then that same partner bought a competing OS product, while another partner released their new product that uses yet another competing OS platform. What did Microsoft actually produce this past week? An update to its Windows Phone 7 developer tools that actually works with the version of Visual Studio 2010 released on April 12th, and the version of Silverlight released three days later. That took three weeks to get synced up, and that's a bit much too. But at least it happened.
    Windows Phone 7 is Microsoft's best hope for a comeback in the smartphone market and for offering a credible touch-based tablet device. This week, two of Microsoft's slate initiatives died, and its only mobile phone victory was around its competitor's operating system. I hope the new platform gets Redmond out of the PC ghetto and into the classes of device that get people really excited today. If it can't, that would be a bit much; probably too much. And, as the signs at the Lonestar Cafe in NYC used to say, too much ain't enough.

    Read the article

  • Two more Entity Framework videos on Pluralsight

    Two new videos that I have created for Visual Studio 2010 have just been published to Pluralsight On-Demand. If you don't have a Pluralsight subscription (yet), these videos are available as part of POD's free guest pass, along with lots of other great content. The new vids are Exploring the Classes Generated from an Entity Data Model and Consuming an Entity Data Model from a Separate .NET Project. They'll be available on the MSDN Data Developer Center as well (msdn.microsoft.com/data) in the very...

    Read the article

  • Software rendering 3d triangles in the proper order

    - by at.
    I'm implementing a basic 3D rendering engine in software (for education purposes; please don't suggest using an API). When I project a triangle from 3D to 2D coordinates, I draw the triangle. However, the triangles are drawn in a random order, so whatever gets drawn last sits on top of all the other triangles (which might end up in front of triangles they shouldn't be in front of)... Intuitively, it seems I need to draw the triangles in the correct order. So I can calculate all their distances to the camera and sort by that: the objects furthest away get drawn first, so nearer ones paint over them. Is this the proper way to render triangles? If I'm sorting all the objects, this is O(n log n) now. Is this the most efficient way to do this?
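    A minimal sketch of the sort described above (C# with System.Numerics; the Triangle type is an illustrative assumption) - the classic painter's algorithm ordering, farthest first:

        using System.Collections.Generic;
        using System.Linq;
        using System.Numerics;

        record Triangle(Vector3 A, Vector3 B, Vector3 C)
        {
            public Vector3 Centroid => (A + B + C) / 3f;
        }

        static class PainterSort
        {
            // Back-to-front: draw the farthest triangles first so nearer ones
            // paint over them. The sort is the O(n log n) step mentioned above.
            public static IEnumerable<Triangle> BackToFront(
                IEnumerable<Triangle> tris, Vector3 camera) =>
                tris.OrderByDescending(t =>
                    Vector3.DistanceSquared(t.Centroid, camera));
        }

    Centroid-distance sorting can still fail for intersecting or cyclically overlapping triangles, which is why a depth buffer is the usual alternative once correctness matters more than simplicity.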

    Read the article
