Search Results

Search found 11901 results on 477 pages for 'triple store'.

Page 265/477

  • new project; entire node.js app

    - by Jared
    I have been looking into Node.js, Express and NowJS and love how easy it is to have real-time interactions between clients. My background is mostly CodeIgniter MVC using PHP and MySQL. I want to remake a current web project of mine from scratch to make everything better and more real-time with this newer technology. After researching and doing test examples, I want to use Node.js, Express and NowJS for the real-time interactions once someone connects via Socket.IO, to pull data back to clients, but use CodeIgniter for the control of the site and user management, a possible shopping cart/store, and pretty much everything else. This is purely due to time constraints and the fact that I am already familiar with doing it that way. I have also been looking at MongoDB as an alternative to MySQL. Basically the app is going to be multiple chat rooms all on one page, with notifications and private messaging, so lots of data transfer and images. Before I start piecing it together I wanted to hear from people who have already done something similar. My model would use CodeIgniter and MySQL to render the page, and then connect the clients onto a Node.js server and broadcast using Express and NowJS. Would MongoDB be better than MySQL for tons of messages and data being stored? Also, does it make sense to not build the whole site on Node.js and to piece it together like that instead? I was asked to repost this somewhere else as it was not up to the format for SO; the OP is here: http://stackoverflow.com/questions/12649469/new-project-need-some-start-up-advice-node-js-app#comment17062924_12649469

    Read the article

  • Beginners Guide to Client Application Services

    - by mbcrump
    What is it? Client application services make it easy for you to create Windows-based applications that use the ASP.NET AJAX login, roles, and profile application services included in the Microsoft ASP.NET 2.0 AJAX Extensions. These services enable multiple Web and Windows-based applications to share user information and user-management functionality from a single server. What can you do with it? Authenticate a user. You can use the authentication service to verify a user's identity. Determine the role or roles of an authenticated user. You can use the roles service to change the user interface of your application depending on the user's role. For example, you can provide additional features for users who are in an administrator role. Store and access per-user application settings located on the server. You can use the Web settings service (also known as the profile service) to share settings across multiple applications and locations. Client application services take advantage of the Web services extensibility model through client service providers that you can specify in your application configuration files. These service providers include offline functionality that uses a local cache for authentication, roles, and settings data when a network connection is unavailable. Give me an example of where I would use this! Sharing login and user role information between a Windows Forms application and an ASP.NET application. How do I configure it? Click Here
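
    For illustration, here is a minimal C# sketch of the authentication and role checks described above, assuming the client application services providers (ClientFormsAuthenticationMembershipProvider and ClientRoleProvider) are already configured in the project's app.config; the user name, password and role name are placeholders.

    using System;
    using System.Threading;
    using System.Web.Security; // requires a reference to System.Web.Extensions

    static class LoginSketch
    {
        static void Main()
        {
            // Routed to the configured ClientFormsAuthenticationMembershipProvider,
            // which calls the ASP.NET authentication service on the server.
            if (!Membership.ValidateUser("someUser", "somePassword"))
            {
                Console.WriteLine("Login failed.");
                return;
            }

            // Role checks go through the ClientRoleProvider and can drive UI decisions,
            // e.g. showing extra features to administrators.
            if (Thread.CurrentPrincipal.IsInRole("Administrators"))
            {
                Console.WriteLine("Enable the admin-only features.");
            }
        }
    }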

    Read the article

  • 2D Collision masks for handling slopes

    - by JiminyCricket
    I've been looking at the example at: http://create.msdn.com/en-US/education/catalog/tutorial/collision_2d_perpixel and am trying to figure out how to adjust the sprite once a collision has been detected. As David suggested at XNA 4.0 2D sidescroller variable terrain heightmap for walking/collision, I made a few sensor points (feet, sides, bottom center, etc.) and can easily detect when these points actually collide with non-transparent portions of a second texture (simple slope). I'm having trouble with the algorithm of how I would actually adjust the sprite position based on a collision. Say I detect a collision with the slope at the sprite's right foot. How can I scan the slope texture data to find the Y position to place the sprite's foot so it is no longer inside the slope? The way it is stored as a 1D array in the example is a bit confusing, should I try to store the data as a 2D array instead? For test purposes, I'm thinking of just using the slope texture alpha itself as a primitive and easy collision mask (no grass bits or anything besides a simple non-linear slope). Then, as in the example, I find the coordinates of any collisions between the slope texture and the sprite's sensors and mark these special sensor collisions as having occurred. Finally, in the case of moving up a slope, I would scan for the first transparent pixel above (in the texture's Ys at that X) the right foot collision point and set that as the new height of the sprite. I'm a little unclear also on when I should make these adjustments. Collisions are checked on every game.update() so would I quickly change the position of the sprite before the next update is called? I also noticed several people mention that it's best to separate collision checks horizontally and vertically, why is that exactly? Open to any suggestions if this is an inefficient or inaccurate way of handling this. I wish MSDN had provided an example of something like this, I didn't know it would be so much more complex than NES Mario style pure box platforming!
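
    For the specific question of where to place the foot, a rough C#/XNA-style sketch is shown below. It assumes the slope texture's pixels were copied out once with Texture2D.GetData into a 1D Color array laid out row by row (index = x + y * width), and that footX/footY are already in the slope texture's local coordinates; the method name is made up.

    using Microsoft.Xna.Framework;

    static class SlopeCollision
    {
        // Scan upward from the colliding foot pixel and return the first transparent
        // row at column footX, or null if the whole column above is solid.
        static int? FindStandingY(Color[] slopePixels, int textureWidth, int footX, int footY)
        {
            for (int y = footY; y >= 0; y--)
            {
                // Row-major layout: the pixel at (x, y) lives at index x + y * width.
                if (slopePixels[footX + y * textureWidth].A == 0)
                    return y; // place the sprite so its foot rests on this transparent pixel
            }
            return null; // no free pixel above; treat it as a wall rather than a slope
        }
    }

    Applying the adjustment right after the collision check inside Update (before the next Draw) keeps the sprite from ever being rendered inside the slope. As for separating horizontal and vertical checks, the usual reason is that resolving each axis independently makes it unambiguous which direction to push the sprite out, so a vertical correction for a slope does not fight a horizontal correction for a wall.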

    Read the article

  • How to get multiple open-source projects to use a standard way of doing something.

    - by Marco
    Problem: In the last couple of weeks, I've used 3 different "repository" tools (listed in alphabetical order): Gradle, Ivy and Maven. I'm calling them "repository" tools because I've also used sbt, which fortunately uses Ivy to manage its cache or local repository. Each of these tools will create its own repository. The defaults are: ~/.m2/repository for Maven, ~/.gradle/cache, and ~/.ivy2/cache. Why can't they all use the same cache? Goal: I'd like to change the world so that all three build tools could use the same cache. I'm looking for advice about issues I'm likely to run into and smart ways to get around them. By "use the same cache", I do not mean "retrieve from another build tool's cache". I mean "retrieve from and store in another build tool's cache". While I could go ahead and submit issues to the three projects, I know from experience (as a developer on an open source project) that if you want something done, you're best off getting it done yourself. Also, it seems like I need to get all 3 communities on board to some degree. What is the recommended approach for getting this kind of thing done? How do I approach the different communities? Do I work on patches for the 3 different projects, or would it be better to create my own "interface" project that deals with these issues and have the 3 tools interface with that? Is this a standards question that I need to address on that front? Lastly, if I'm missing something and this is already possible (in a globally configurable fashion), then please let me know.

    Read the article

  • Download Instagram photos of any user on Windows, Mac & Linux using 4K Stogram

    - by Gopinath
    Instagram is one of the most popular mobile applications for taking pictures and sharing them online. Initially released for iOS devices, Instagram quickly won the hearts of photographers and made it into the top 10 app charts in the Apple App Store. With a simple interface and quick photo-processing features, Instagram's popularity has grown by leaps and bounds. A few months after Instagram released an Android application, Facebook acquired it for around 1 billion dollars. Undoubtedly Instagram is the most popular photo-sharing app on mobile devices, and it's where you find the best photographers sharing their beautiful pictures online. Being such a rich source of photographs, many users love to browse and download them. For those looking to download photographs from Instagram, here is an excellent free desktop application – 4K Stogram. Once 4K Stogram is installed, you can quickly browse and download the photographs of any user by entering a user name in the search box and clicking the Follow button. The app searches for the recent photographs and displays the thumbnails as it downloads them. You can double-click any photograph to view it with your operating system's default photo viewer; open the location of the photograph and you will find the rest of the downloaded photos in the same folder. Once you enter the name of a user, 4K Stogram keeps tracking that user until you manually delete it, and it automatically downloads the latest photographs when it's launched again. In this way you can track multiple users and download their photographs. Here is a quick rundown of 4K Stogram features: View and download photos of Instagram users. No need to have an Instagram account, just enter a user name and download all photos. Track multiple users and download their photos. Automatically download the latest photos. All new photos are marked with an indicator for quick reference. Free and open source desktop application that runs on Windows, Mac & Linux. Link to 4K Stogram Application website

    Read the article

  • An unexpected pleasure from Windows 8

    - by eddraper
    This post is certainly on the more nuanced side of all the goodness that is Windows 8, but it's about something that's really changed my PC usage experience for the better. Besides being a geek and enjoying all the techno-thrills and chills that go along with sitting in front of a keyboard all day, I really love the forest. Trees have always been special to me. The feeling of being in the forest with all the sounds and ambiance, the broken light, the fragrance of the air… it's paradise to me. As I can't get there often, due to work, and quite often the heat here in Texas, I've found something that can at least partially fill the gap… When you install Windows 8, you'll have an app called "Naturespace" from http://www.naturespace.com/. It boasts a number of predefined loops in what they call "holographic audio." They're essentially high-tech 3D sound fields recorded in natural environments. After checking them out, I really liked the sound of the "Daybreak" selection. A great benefit is that you don't have to be in Metro/Modern/Windows App Store mode in order to keep the sound playing. To start the day, I click on Daybreak, start it, then go back to the desktop and fire up VS, Chrome, etc. As I work and play, I'm surrounded by this delightful background ambiance which relaxes me and puts my mind at ease. Give it a try. I think you'll like it. And no, you don't need ear buds or headphones to get the benefit.

    Read the article

  • What is a good support knowledge base tool?

    - by Guillaume
    I have been searching for a tool to help my team organize its knowledge for resolving recurring support cases. I know this question will probably be closed, but I'll take my chances anyway because I know I can get some good answers about it. Context: our team is developing and supporting a huge application (lots of different screens and workflow processes). We already have a good tool for managing our documentation, but we are struggling with support cases. Support actions often involve quite a lot of manual steps to fix stuff, and the knowledge for these actions is passed on more by 'oral transmission' than by modern tools. We need an efficient way to store it in a knowledge base so we can retrieve similar cases based on patterns (a stacktrace, an error message, a component name, a workflow step, ...) and ranked by similarity. Our wiki search is not very powerful when it comes to this kind of search, and the team members don't want to 'waste' time writing a report that will never be found... Do you know an efficient knowledge base tool for this kind of use case?

    Read the article

  • Client side latency when using prediction

    - by Tips48
    I've implemented client-side prediction in my game: when input is received by the client, it first sends it to the server and then acts upon it just as the server will, to reduce the appearance of lag. The problem is, the server is authoritative, so when the server sends back the position of the entity to the client, it undoes the effect of the prediction and creates a rubber-banding effect. For example: Client sends input to server - Client reacts on input - Server receives and reacts on input - Server sends back response - Client reaction is undone due to latency between server and client. To solve this, I've decided to store the game state and input every tick on the client, and then when I receive a packet from the server, get the game state from when the packet was sent and simulate the game up to the current point. My questions: Won't this cause lag? If I'm receiving 20-30 EntityPositionPackets a second, that means I have to run 20-30 simulations of the game state. How do I sync the client and server tick? Currently, I'm sending the millisecond the packet was sent by the server, but I think it's adding too much complexity instead of just sending the tick. The problem with converting it to sending the tick is that I have no guarantee that the client and server are ticking at the same rate, for example if the client is a low-end PC.
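
    To make the 'store state and inputs per tick, then re-simulate' idea concrete, here is a rough C# sketch of server reconciliation; the tick field, state fields and movement rule are placeholder assumptions, and the point is only the shape of the buffer and the replay loop.

    using System.Collections.Generic;

    struct InputCmd { public int Tick; public float MoveX; }
    struct PlayerState { public float X; }

    class PredictedEntity
    {
        readonly Queue<InputCmd> pendingInputs = new Queue<InputCmd>();
        public PlayerState State;

        public void ApplyLocalInput(InputCmd cmd)
        {
            State = Simulate(State, cmd); // predict immediately, using the same rules as the server
            pendingInputs.Enqueue(cmd);   // keep it until the server has acknowledged that tick
        }

        // Called when an authoritative state for serverTick arrives.
        public void Reconcile(int serverTick, PlayerState serverState)
        {
            while (pendingInputs.Count > 0 && pendingInputs.Peek().Tick <= serverTick)
                pendingInputs.Dequeue();  // these inputs are already reflected in serverState

            State = serverState;          // snap to the authoritative state...
            foreach (var cmd in pendingInputs)
                State = Simulate(State, cmd); // ...then replay the inputs the server has not seen yet
        }

        static PlayerState Simulate(PlayerState s, InputCmd c)
        {
            s.X += c.MoveX * 0.1f;        // placeholder movement rule
            return s;
        }
    }

    Replaying the handful of unacknowledged inputs per server update is normally far cheaper than a rendered frame, which is why this approach is generally considered practical even at 20-30 corrections per second.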

    Read the article

  • Backup Failed, need help not failing

    - by Costa
    Backup (Deja Dup) failed to do an initial backup to my Amazon S3, and despite my best Googling efforts, I could not find a solution : ( Here's the error message: BackendException: Error uploading s3+http://deja-dup-auto-akiaiksyiqi3buaaz26a/My-Archive/duplicity-full.20130805T143807Z.vol1.difftar.gpg I'm running Ubuntu 12.04 LTS on a System76. The folder I'm trying to backup to in S3 is set to store as Amazon Glacier Storage. Any help would rock! Update: better debugging info: DUPLICITY: . Failed to create bucket (attempt #1) 'deja-dup-auto-axxxxxxxxxxxa' failed (reason: S3ResponseError: S3ResponseError: 403 Forbidden DUPLICITY: . <?xml version="1.0" encoding="UTF-8"?> DUPLICITY: . <Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message><StringToSignBytes>47 00 54 0a 0a 0a 4d 6f 6e 00 20 30 35 20 00 75 67 20 32 30 31 00 00 35 00 32 34 3a 31 32 20 47 4d 00 0a 2f 64 65 6a 61 2d 64 75 70 2d 61 75 74 6f 2d 61 6b 69 00 6b 73 79 69 71 69 33 62 75 61 00 7a 32 36 61 2f</StringToSignBytes><RequestId>8000000000003</RequestId><HostId>Uxxxxxxxxxxxxxxxxx/xxxxxxxxxxxxxxxxRF</HostId><SignatureProvided>yxxxxxxxxxxxxxxxxxxxxx</SignatureProvided><StringToSign>GET

    Read the article

  • How can I perform 2D side-scroller collision checks in a tile-based map?

    - by bill
    I am trying to create a game where you have a player that can move horizontally and jump. It's kind of like Mario but it isn't a side scroller. I'm using a 2D array to implement a tile map. My problem is that I don't understand how to check for collisions using this implementation. After spending about two weeks thinking about it, I've got two possible solutions, but both of them have some problems. Let's say that my map is defined by the following tiles: 0 = sky 1 = player 2 = ground The data for the map itself might look like: 00000 10002 22022 For solution 1, I'd move the player (the 1) a complete tile and update the map directly. This makes the collision easy because you can check if the player is touching the ground simply by looking at the tile directly below the player:
    // x and y are the tile coordinates of the player. The tile origin is the upper-left.
    if (grid[x][y+1] == 2) {
        // The player is standing on top of a ground tile.
    }
    The problem with this approach is that the player moves in discrete tile steps, so the animation isn't smooth. For solution 2, I thought about moving the player via pixel coordinates and not updating the tile map. This will make the animation much smoother because I have a smaller movement unit per frame. However, this means I can't really accurately store the player in the tile map because sometimes he would logically be between two tiles. But the bigger problem here is that I think the only way to check for collision is to use Java's intersection method, which means the player would need to be at least a single pixel "into" the ground to register collision, and that won't look good. How can I solve this problem?
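
    The question's code is Java, but the idea behind solution 2 is language-neutral, so here is a rough C# sketch of it: keep the player out of the grid entirely and derive the tiles under the player from the pixel position on demand. The tile size and the '2 means ground' convention are assumptions carried over from the question; the names are made up.

    static class TileCollision
    {
        const int TileSize = 16; // assumed pixel size of one tile

        // True if any tile directly under the player's feet is ground (2).
        // playerX/playerY are the pixel coordinates of the player's top-left corner;
        // bounds checking against the edges of the grid is omitted for brevity.
        static bool IsStandingOnGround(int[,] grid, float playerX, float playerY,
                                       int playerWidth, int playerHeight)
        {
            int leftTile  = (int)(playerX / TileSize);
            int rightTile = (int)((playerX + playerWidth - 1) / TileSize);
            int footRow   = (int)((playerY + playerHeight) / TileSize); // row just below the feet

            for (int x = leftTile; x <= rightTile; x++)
                if (grid[x, footRow] == 2)
                    return true;

            return false;
        }
    }

    Because the check is pure index arithmetic on the pixel position, the player never needs to be written back into the tile map, and a collision can be resolved by snapping the player's bottom edge to footRow * TileSize rather than relying on a generic rectangle-intersection test.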

    Read the article

  • Blogging from Office RT

    - by Dennis Vroegop
    During the last Build conference all attendees were given a brand new, sparkling, exciting Surface RT device (I love that machine despite its name, but that's beside the point). On it came a version of Office 2013 RT, or rather the preview version. Now, I translated that term "Preview" to "Beta", which is OK, since I've been using a lot of beta products from Microsoft and they all were great. And then I wanted to post a blog post from Word. I knew I could; I have been doing this for a long time (I prefer Live Writer but that isn't available on Windows 8 RT). So I wrote the entry and hit "Publish". Instead of seeing it on my blog site I got a nice non-descriptive error telling me I couldn't post. So I fired up my other (Intel based) Win8 tablet, opened Word RT Preview, which loaded my blog post (you've got to love the automatic synchronization through SkyDrive), and tried from that machine. Same error. So, I installed Live Writer (remember, the other machine is Intel based) and posted from there. That worked like a charm. Apparently, there was something wrong with Word. I gave up and didn't think about it anymore. Yet… what you're reading now is written in Word 2013 RT on my Surface RT. So what did I do? Simple: I updated from the Preview version to the final version. That's all there was to it. So… if you're still on the preview I urge you to upgrade. You need to go to the "classic desktop update" window instead of going through the Windows Store app style update, since Office is a desktop system, but once you do that you'll have the full version as well. Happy blogging!

    Read the article

  • Does the use of mongodb enhance extending/changing database driven applications?

    - by developer10214
    When an application is created which needs to store data, an SQL database is used very often. I did the same in a lot of ASP.NET applications. The resulting applications often have an ORM like Entity Framework and maybe a business layer. So when such an application needs to be extended (let's say you have to add a comment property to an object), you have to change/extend the database, then the ORM and the business layer and so on. To deploy the changes you have to update the target database and the application. I know that things like Code First and fluent APIs can make this approach easier. I tried MongoDB, using only the standard driver, and when I had to extend some objects all I had to do was change the code. So it feels like such changes are much easier to make when using MongoDB. I don't have much experience with larger applications and MongoDB. I know that an SQL database or MongoDB doesn't fit all needs and both have their pros and cons. I want to know if my feeling is right; if it is, I would rather choose MongoDB than an SQL database.
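
    As a concrete illustration of the 'just change the code' point, here is a hedged sketch using the MongoDB .NET driver's 2.x API (which may differ from the driver version the poster used); the class, database and collection names are made up. Adding the Comment property needs no schema migration: new documents simply carry the extra field, and older documents deserialize with the property left at its default value.

    using MongoDB.Driver; // NuGet package: MongoDB.Driver (2.x API assumed)

    public class Post
    {
        public string Id { get; set; }
        public string Title { get; set; }
        public string Comment { get; set; } // the newly added property
    }

    static class MongoSketch
    {
        static void Main()
        {
            var client = new MongoClient("mongodb://localhost:27017");
            var posts = client.GetDatabase("demo").GetCollection<Post>("posts");

            // No ALTER TABLE, no ORM mapping update, no separate schema deployment step.
            posts.InsertOne(new Post { Id = "post-1", Title = "Hello", Comment = "First!" });
        }
    }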

    Read the article

  • how to install 13.04 on a partitioned hard drive

    - by Denny
    First, I'm not a computer-literate person, not even a novice, so please use small words. I recently made the switch to Ubuntu; it came preloaded on the new laptop that I ordered from a big tech dot com site. The version on it is 12.04 (I think), 64-bit. This system has a lot that I like, but it is quirky for me to say the least. Apparently I have held broken packages and have no way of knowing how to find them. I discovered this when trying to download VLC (from the Software Center) so that I could watch some movies I had on an external hard drive. Unmet dependencies errors and held broken package errors abound while trying to fix the problem. I've scoured this site and others and followed almost all the suggestions to a T, but still I am unable to fix anything. My computer is partitioned (but I don't even know how to get to the other side, so to speak). I would like to know: can I put the newer 13.04 OS on one side of the partition and then delete the older version on the other side? Or can I install 13.04 over the existing 12.04? What would I need to do this? An obstacle that I have is this: I am currently serving in Afghanistan, so going someplace to buy something or running down to a computer store for service support is out of the question. I very much appreciate your help, because right now this computer is nothing more than a word processor, which would be fine if all I wanted was a word processor. Thanks in advance.

    Read the article

  • What music player for Ubuntu is *most* like Cog?

    - by mattshepherd
    Cog is, far and away, my favourite music player. File browser on the left panel, pull a folder into the right panel, it plays the files you pull into the main box. Simple! Easy! Nice! Especially since I store all my music on my non-primary drive. Cog doesn't care where I keep my music! It just deals with whatever folder I tell it to monitor. I also have about 200 gigs of music. It's a problem, I know. I can't find a player this simple and elegant in Ubuntu. Amarok: doesn't handle non-primary drives very well. Banshee: kinda good, but fussy browsing/playlist systems. Audacious: no file browser; it also crashes whenever I try to "import" all my music. All I want -- seriously -- is something where I can look at my MP3 folder, drag a folder onto a playlist, and have that playlist play the songs.

    Read the article

  • IdentityServer Beta 1 Refresh & Windows Azure Support

    - by Your DisplayName here!
    I just uploaded two new releases to CodePlex. IdentityServer B1 refresh: a number of bug fixes and streamlined extensibility interfaces, mostly a result of adding the Windows Azure support. Nothing has changed with regards to setup. Make sure you watch the intro video on the CodePlex site. IdentityServer B1 (Windows Azure Edition): I have now implemented all repositories for Windows Azure backed data storage. The default setup assumes you use the ASP.NET SQL membership provider database in SQL Azure and Table Storage for relying party, client certificates and delegation settings. The setup is simple:
    Upload your SSL/signing certificate via the portal.
    Adjust the .cscfg file – you need to insert your storage account, certificate thumbprint and distinguished name. There is a setup tool that can automatically insert the certificate distinguished names into your config file.
    Adjust the connection string for the membership database in WebSite\configuration\connectionString.config.
    Deploy.
    Feedback: feature-wise this looks like the V1 release to me. It would be great if you could give me feedback when you find a bug etc. – especially: Do the built-in repository implementations work for you (both on-premise and Azure)? Are the repository interfaces enough to add your own data store or feature?

    Read the article

  • Will a polled event system cause lag for a server?

    - by Milo
    I'm using a library called ENet. It is a reliable UDP library. The way it works is a polled event system like this:
    ENetEvent event;
    /* Wait up to 1000 milliseconds for an event. */
    while (enet_host_service (client, & event, 1000) > 0)
    {
        switch (event.type)
        {
        case ENET_EVENT_TYPE_CONNECT:
            printf ("A new client connected from %x:%u.\n",
                    event.peer -> address.host,
                    event.peer -> address.port);
            /* Store any relevant client information here. */
            event.peer -> data = "Client information";
            break;
        case ENET_EVENT_TYPE_RECEIVE:
            printf ("A packet of length %u containing %s was received from %s on channel %u.\n",
                    event.packet -> dataLength,
                    event.packet -> data,
                    event.peer -> data,
                    event.channelID);
            /* Clean up the packet now that we're done using it. */
            enet_packet_destroy (event.packet);
            break;
        case ENET_EVENT_TYPE_DISCONNECT:
            printf ("%s disconected.\n", event.peer -> data);
            /* Reset the peer's client information. */
            event.peer -> data = NULL;
        }
    }
    It waits up to 1000 milliseconds for an event. If I'm hosting, say, 75 event-driven card games and a lobby on the same thread as this code, will it cause any problems? If my understanding is correct, the process will simply sleep until there is an event; when there is one, it will process the event and then come back here, where potentially 5 or so events have queued up in the meantime, so enet_host_service would return right away and not cause lag. I have been advised not to use multiple threads, will that be alright like this? Thanks

    Read the article

  • How Mature is Your Database Change Management Process?

    - by Ben Rees
    How do you get your database schema changes live, on to your production system? As your team of developers and DBAs are working on the changes to the database to support your business-critical applications, how do these updates wend their way through from dev environments, possibly to QA, hopefully through pre-production and eventually to production in a controlled, reliable and repeatable way? In this article, I describe a model we use to try and understand the different stages that customers go through as their database change management processes mature, from the very basic and manual, through to advanced continuous delivery practices. I also provide a simple chart that will help you determine “How mature is our database change management process?” This process of managing changes to the database – which all of us who have worked in application/database development have had to deal with in one form or another – is sometimes known as Database Change Management (even if we’ve never used the term ourselves). And it’s a difficult process, often painfully so. Some developers take the approach of “I’ve no idea how my changes get live – I just write the stored procedures and add columns to the tables. It’s someone else’s problem to get this stuff live. I think we’ve got a DBA somewhere who deals with it – I don’t know, I’ve never met him/her”. I know I used to work that way. I worked that way because I assumed that making the updates to production was a trivial task – how hard can it be? Pause the application for half an hour in the middle of the night, copy over the changes to the app and the database, and switch it back on again? Voila! But somehow it never seemed that easy. And it certainly was never that easy for database changes. Why? Because you can’t just overwrite the old database with the new version. Databases have a state – more specifically 4Tb of critical data built up over the last 12 years of running your business, and if your quick hotfix happened to accidentally delete that 4Tb of data, then you’re “Looking for a new role” pretty quickly after the failed release. There are a lot of other reasons why a managed database change management process is important for organisations, besides job security, not least: Frequency of releases. Many business managers are feeling the pressure to get functionality out to their users sooner, quicker and more reliably. The new book (which I highly recommend) Lean Enterprise by Jez Humble, Barry O’Reilly and Joanne Molesky provides a great discussion on how many enterprises are having to move towards a leaner, more frequent release cycle to maintain their competitive advantage. It’s no longer acceptable to release once per year, leaving your customers waiting all year for changes they desperately need (and expect) Auditing and compliance. SOX, HIPAA and other compliance frameworks have demanded that companies implement proper processes for managing changes to their databases, whether managing schema changes, making sure that the data itself is being looked after correctly or other mechanisms that provide an audit trail of changes.
We’ve found, at Red Gate that we have a very wide range of customers using every possible form of database change management imaginable. Everything from “Nothing – I just fix the schema on production from my laptop when things go wrong, and write it down in my notebook” to “A full Continuous Delivery process – any change made by a dev gets checked in and recorded, fully tested (including performance tests) before a (tested) release is made available to our Release Management system, ready for live deployment!”. And everything in between of course. Because of the vast number of customers using so many different approaches we found ourselves struggling to keep on top of what everyone was doing – struggling to identify patterns in customers’ behavior. This is useful for us, because we want to try and fit the products we have to different needs – different products are relevant to different customers and we waste everyone’s time (most notably, our customers’) if we’re suggesting products that aren’t appropriate for them. If someone visited a sports store, looking to embark on a new fitness program, and the store assistant suggested the latest $10,000 multi-gym, complete with multiple weights mechanisms, dumb-bells, pull-up bars and so on, then he’s likely to lose that customer. All he needed was a pair of running shoes! To solve this issue – in an attempt to simplify how we understand our customers and our offerings – we built a model. This is a an attempt at trying to classify our customers in to some sort of model or “Customer Maturity Framework” as we rather grandly term it, which somehow simplifies our understanding of what our customers are doing. The great statistician, George Box (amongst other things, the “Box” in the Box-Jenkins time series model) gave us the famous quote: “Essentially all models are wrong, but some are useful” We’ve taken this quote to heart – we know it’s a gross over-simplification of the real world of how users work with complex legacy and new database developments. Almost nobody precisely fits in to one of our categories. But we hope it’s useful and interesting. There are actually a number of similar models that exist for more general application delivery. We’ve found these from ThoughtWorks/Forrester, from InfoQ and others, and initially we tried just taking these models and replacing the word “application” for “database”. However, we hit a problem. From talking to our customers we know that users are far less further down the road of mature database change management than they are for application development. As a simple example, no application developer, who wants to keep his/her job would develop an application for an organisation without source controlling that code. Sure, he/she might not be using an advanced Gitflow branching methodology but they’ll certainly be making sure their code gets managed in a repo somewhere with all the benefits of history, auditing and so on. But this certainly isn’t the case (yet) for the database – a very large segment of the people we speak to have no source control set up for their databases whatsoever, even at the most basic level (for example, keeping change scripts in a source control system somewhere). By the way, if this is you, Red Gate has a great whitepaper here, on the barriers people face getting a source control process implemented at their organisations. 
This difference in maturity is the same as you move in to areas such as continuous integration (common amongst app developers, relatively rare for database developers) and automated release management (growing amongst app developers, very rare for the database). So, when we created the model we started from scratch and biased the levels of maturity towards what we actually see amongst our customers. But, what are these stages? And what level are you? The table below describes our definitions for four levels of maturity – Baseline, Beginner, Intermediate and Advanced. As I say, this is a model – you won’t fit any of these categories perfectly, but hopefully one will ring true more than others. We’ve also created a PDF with a flow chart to help you find which of these groups most closely matches your team:  Download the Database Delivery Maturity Framework PDF here   Level D1 – Baseline Work directly on live databases Sometimes work directly in production Generate manual scripts for releases. Sometimes use a product like SQL Compare or similar to do this Any tests that we might have are run manually Level D2 – Beginner Have some ad-hoc DB version control such as manually adding upgrade scripts to a version control system Attempt is made to keep production in sync with development environments There is some documentation and planning of manual deployments Some basic automated DB testing in process Level D3 – Intermediate The database is fully version-controlled with a product like Red Gate SQL Source Control or SSDT Database environments are managed Production environment schema is reproducible from the source control system There are some automated tests Have looked at using migration scripts for difficult database refactoring cases Level D4 – Advanced Using continuous integration for database changes Build, testing and deployment of DB changes carried out through a proper database release process Fully automated tests Production system is monitored for fast feedback to developers   Does this model reflect your team at all? Where are you on this journey? We’d be very interested in knowing how you get on. We’re doing a lot of work at the moment, at Red Gate, trying to help people progress through these stages. For example, if you’re currently not source controlling your database, then this is a natural next step. If you are already source controlling your database, what about the next stage – continuous integration and automated release management? To help understand these issues, there’s a summary of the Red Gate Database Delivery learning program on our site, alongside a Patterns and Practices library here on Simple-Talk and a Training Academy section on our documentation site to help you get up and running with the tools you need to progress. All feedback is welcome and it would be great to hear where you find yourself on this journey! This article is part of our database delivery patterns & practices series on Simple Talk. Find more articles for version control, automated testing, continuous integration & deployment.

    Read the article

  • CodePlex Daily Summary for Saturday, October 19, 2013

    CodePlex Daily Summary for Saturday, October 19, 2013Popular ReleasesMedia Companion: Media Companion MC3.583b: As before release but fixed for no movie poster sourcesNew* Both - Added 'An' as option to ignore in title * Movie - Renaming - added %Z - Sorttitle to Legend * Movie - Renaming - added %O - Audio Channels to Legend * Movie - Remove a poster source from priority list. Reset List back to defaults. * Made Media Companion truly portable application. Fixed* Movie - browse for Poster Or Fanart, allows for jpg, tbn, png and bmp images * Movie - Alt Fanart Browser - Url or Browse window now fully...MoreTerra (Terraria World Viewer): MoreTerra 1.11.3.1: Release 1.11.3.1 ================ = New Features = ================ Added markers for Copper Cache, Silver Cache and the Enchanted Sword. ============= = Bug Fixes = ============= Use Official Colors now no longer tries to change the Draw Wires option instead. World reading was breaking for people with a stock 1.2 Terraria version. Changed world name reading so it does not crash the program if you load MoreTerra while Terraria is saving the world. =================== = Feature Removal = =...patterns & practices - Windows Azure Guidance: Cloud Design Patterns: 1st drop of Cloud Design Patterns project. It contains 14 patterns with 6 related guidance.Json.NET: Json.NET 5.0 Release 8: Fix - Fixed not writing string quotes when QuoteName is falsePowerShell Community Extensions: 3.1 Production: PowerShell Community Extensions 3.1 Release NotesOct 17, 2013 This version of PSCX supports Windows PowerShell 3.0 and 4.0 See the ReleaseNotes.txt download above for more information.SQL Power Doc: Version 1.0.2.1: Misc. bug fixes Added logic to resolve members of a Windows Group server login Added columns to Excel workbooks to show definitions for server permissions, server roles, database permissions, and database rolesSocial Network Importer for NodeXL: SocialNetImporter(v.1.9): This new version includes: - Download latest status update and use it as vertex tooltip - Limit the timelines to parse to me, my friends or both - Fixed some reported bugs about the fan page and group importer - Fixed the login bug reported latelyDotNetNuke® Wiki: 05.00.00: Changes made to better support upgrades and the removal of deprecated legacy files that were causing formatting issues. Updated the Version number to better indicate the significance of the C# migration and the new DNN 7.0.2 minimum requirement.TerrariViewer: TerrariViewer v7.1 [Terraria Inventory Editor]: You can now backspace in number fields Items added in 1.2.0.3 no longer corrupt player files Buff durations capped at 9999999 Item stacks capped at 9999999 Version info added Prefix IDs corrected Shoe and Eye color box are now properly clickable Moved Bank and Safe into their own tab Users will now be notified of new updatesPython Tools for Visual Studio: 2.0: PTVS 2.0 We’re pleased to announce the release of Python Tools for Visual Studio 2.0 RTM. Python Tools for Visual Studio (PTVS) is an open-source plug-in for Visual Studio which supports programming with the Python language. PTVS supports a broad range of features including CPython/IronPython, Edit/Intellisense/Debug/Profile, Cloud, IPython, and cross platform and cross language debugging support. 
QUICK VIDEO OVERVIEW For a quick overview of the general IDE experience, please watch this v...C# Intellisense for Notepad++: Release v.1.0.8.2: Solved scrolling problem after DocumentFormatting Implemented "format as you type" --- To avoid the DLLs getting locked by OS use MSI file for the installation.CS-Script for Notepad++ (C# intellisense and code execution): Release v1.0.8.2: Solved scrolling problem after DocumentFormatting Implemented "format as you type" --- To avoid the DLLs getting locked by OS use MSI file for the installation.Collection Commander for Configuration Manager 2012: CMCollCtr 1.0.0: Change log: - MSI Setup - UI Improved - CM12 Console integration - New Powershell code snippets - Client Center IntegrationLINQ to Twitter: LINQ to Twitter v2.1.09: Supports .NET 3.5, .NET 4.0, .NET 4.5, Silverlight 4.0, Windows Phone 7.1, Windows Phone 8, Client Profile, Windows 8, and Windows Azure. 100% Twitter API coverage. Also supports Twitter API v1.1! Also on NuGet.Sandcastle Help File Builder: SHFB v1.9.8.0 with Visual Studio Package: General InformationIMPORTANT: On some systems, the content of the ZIP file is blocked and the installer may fail to run. Before extracting it, right click on the ZIP file, select Properties, and click on the Unblock button if it is present in the lower right corner of the General tab in the properties dialog. This new release contains bug fixes and feature enhancements. There are some potential breaking changes in this release as some features of the Help File Builder have been moved into...C++ REST SDK (codename "Casablanca"): C++ REST SDK 1.3.0: This release fixes multiple customer reported issues as well as the following: Full support for Dev12 binaries and project files Full support for Windows XP New sample highlighting the Client and Server APIs : BlackJack Expose underlying native handle to set custom options on http_client Improvements to Listener Library Note: Dev10 binaries have been dropped as of this release, however the Dev10 project files are still available in the Source CodeAD ACL Scanner: 1.3.2: Minor bug fixed: Powershell 4.0 will report: Select—Object: Parameter cannot be processed because the parameter name p is ambiguous.Fast YouTube Downloader: YouTube Downloader 2.2.0: YouTube Downloader 2.2.0VidCoder: 1.5.8 Beta: Added hardware acceleration options: Bicubic OpenCL scaling algorithm, QSV decoding/encoding and DXVA decoding. Updated HandBrake core to SVN 5834. Updated VidCoder setup icon. Fixed crash when choosing the mp4v2 container on x86 and opening on x64. Warning: the hardware acceleration features require specific hardware or file types to work correctly: QSV: Need an Intel processor that supports Quick Sync Video encoding, with a monitor hooked up to the Intel HD Graphics output and the lat...ASP.net MVC Awesome - jQuery Ajax Helpers: 3.5.2: version 3.5.2 - fix for setting single value to multivalue controls - datepicker min max date offset fix - html encoding for keys fix - enable Column.ClientFormatFunc to be a function call that will return a function version 3.5.1 - fixed html attributes rendering - fixed loading animation rendering - css improvements version 3.5 ========================== - autosize for all popups ( can be turned off by calling in js awe.autoSize = false ) - added Parent, Paremeter extensions ...New Projectsag2: ag2ApplicationMobile: ApplicationMobileBing Maps/Geolocalization Windows Store Template: Template de aplicativo para Windows 8 utilizando APIs do Bing Maps e Geolocalização. 
C3D.NET: C3D.NET is a free class library for manipulating C3D file. This project supports .NET 2.0 and .NET 4.0, and provides a free Data Viewer based on this library.DrinksMachineCMC: DrinksMachineCMCFerdeenFatalSpeech: Speech recognition for PC control.FuzzyLogic: FuzzyLogic Demohousekeeping: ?????isolutions Techdays 2013: Zeigt die Möglichkeit des Zugriffes auf ein CRM 2013 Online mit einer Windows Store App und aus einem Word Plugin.Learn .NET Gadgeteer: .NET Gadgeteer example projects using the FEZ Cerberus kit from GHILIT.Logger.ServerChecker: Small application for server monitoring, which demonstrates capabilities of LIT.Logger (www.litlogger.com)LuaTools for Visual Studio: luatools lua visual studioMTM Test Plan Viewer: A UI tool that lets you easily view, explore and export MTM test results.NetworkScan: Network scan windowsObservable Linq: Observable Linq and Observable Expressions is about to change the paradigm how to use expressions and Linq, providing update events whenever the results change.PatientManagement: beep beeppescar-shop-El-enchufe-khristo-grecia-giselle: falta la aplicacion de LINQ...ProjetoEscola: Projeto Integrado Senac Proyecto Librería PRISA: Este sistema servirá para controlar las compras, ventas y almacén de librería "PRISA". Se desarrollará en Visual Studio 2012 con patrón MVC y en base a capas.TDataBase: Simple db query libraryzongtest: test

    Read the article

  • CodePlex Daily Summary for Tuesday, December 18, 2012

    CodePlex Daily Summary for Tuesday, December 18, 2012Popular Releasessb0t v.5: sb0t 5 Template for Visual Studio: This is the official sb0t 5 template for Visual Studio 2010 and Visual Studio 2012 for C# programmers. Use this template to create your own sb0t 5 extensions.F# PowerPack with F# Compiler Source Drops: PowerPack for FSharp 3.0 + .NET 4.x + VS2010: This is a release of the old FSharp Power Pack binaries recompiled for F# 3.0, .NET 4.0/4.5 and Silveright 5. NOTE: This is for F# 3.0 & .NET 4.0 or F# 3.0 & Silverlight 5 NOTE: The assemblies are no longer strong named NOTE: The assemblies are not added to the GAC NOTE: In some cases functionality overlaps with F# 3.0, e.g. SI Units of measurecodeSHOW: codeSHOW AppPackage Release 16: This release is a package file built out of Visual Studio that you can side load onto your machine if for some reason you don't have access to the Windows Store. To install it, just unzip and run the .ps1 file using PowerShell (right click | run with PowerShell). Attention: if you want to download the source code for codeSHOW, you're in the wrong place. You need to go to the SOURCE CODE tab.Move Mouse: Move Mouse 2.5.3: FIXED - Issue where it errors on load if the screen saver interval is over 333 minutes.LINUX????????: LINUX????????: LINUX????????cnbeta: cnbeta: cnbetaCSDN ??: CSDN??????: CSDN??????PowerShell Community Extensions: 2.1.1 Production: PowerShell Community Extensions 2.1.1 Release NotesDec 16, 2012 This version of PSCX supports both Windows PowerShell 2.0 and 3.0. Bug fix for HelpUri error with the Get-Help proxy command. See the ReleaseNotes.txt download above for more information.VidCoder: 1.4.11 Beta: Added Hungarian translation, thanks to Brechler Zsolt. Update HandBrake core to SVN 5098. This update should fix crashes on some files. Updated the enqueue split button to fit in better with the active Windows theme. Updated presets to use x264 preset/profile/level.BarbaTunnel: BarbaTunnel 6.0: Check Version History for more information about this release.???: Cnblogs: CNBLOGSSandcastle Help File Builder: SHFB v1.9.6.0 with Visual Studio Package: General InformationIMPORTANT: On some systems, the content of the ZIP file is blocked and the installer may fail to run. Before extracting it, right click on the ZIP file, select Properties, and click on the Unblock button if it is present in the lower right corner of the General tab in the properties dialog. This new release contains bug fixes and feature enhancements. There are some potential breaking changes in this release as some features of the Help File Builder have been moved into...Electricity, Gas and Temperature Monitoring with Netduino Plus: V1.0.1 Netduino Plus Monitoring: This is the first stable release from the Netduino Plus Monitoring program. Bugfixing The code is enhanced at some places in respect to the V0.6.1 version There is a possibility to add multiple S0 meters Website for realtime display of data Website for configuring the Netduino Comments are welcome! Additions will not be made to this version. This is the first and last major Netduino Plus V1 release. The new development will take place with the Netduino Plus V2 development board in mi...CRM 2011 Visual Ribbon Editor: Visual Ribbon Editor (1.3.1116.8): [FIX] Fixed issue not displaying CRM system button images correctly (incorrect path in file VisualRibbonEditor.exe.config)My Expenses Windows Store LOB App Demo: My Expenses Version 1: This is version 1 of the MyExpenses Windows 8 line of business demo app. 
The app is written in XAML and C#. It calls a WCF service that works with a SQL Server database. The app uses the Callisto toolkit. You can get it at https://github.com/timheuer/callisto. The Expenses.sql file contains the SQL to create the Expenses database. The ExpensesWCFService.zip file contains the WCF service, also written in C#. You should create a WCF service. Create an Entity Framework model and point it to...BlackJumboDog: Ver5.7.4: 2012.12.13 Ver5.7.4 (1)Web???????、???????????????????????????????????????????VFPX: ssClasses A1.0: My initial release. See https://vfpx.codeplex.com/wikipage?title=ssClasses&referringTitle=Home for a brief description of what is inside this releaseLayered Architecture Solution Guidance (LASG): LASG 1.0.0.8 for Visual Studio 2012: PRE-REQUISITES Open GAX (Please install Oct 4, 2012 version) Microsoft® System CLR Types for Microsoft® SQL Server® 2012 Microsoft® SQL Server® 2012 Shared Management Objects Microsoft Enterprise Library 5.0 (for the generated code) Windows Azure SDK (for layered cloud applications) Silverlight 5 SDK (for Silverlight applications) THE RELEASE This release only works on Visual Studio 2012. Known Issue If you choose the Database project, the solution unfolding time will be slow....Fiskalizacija za developere: FiskalizacijaDev 2.0: Prva prava produkcijska verzija - Zakon je tu, ova je verzija uskladena sa trenutno važecom Tehnickom specifikacijom (v1.2. od 04.12.2012.) i spremna je za produkcijsko korištenje. Verzije iza ove ce ovisiti o naknadnim izmjenama Zakona i/ili Tehnicke specifikacije, odnosno, o eventualnim greškama u radu/zahtjevima community-a za novim feature-ima. Novosti u v2.0 su: - That assembly does not allow partially trusted callers (http://fiskalizacija.codeplex.com/workitem/699) - scheme IznosType...Bootstrap Helpers: Version 1: First releaseNew ProjectsAsh Launcher: The ash launcher is a launcher for the ash modpack for minecraftAsynchronous PowerShell Module: psasync is a PowerShell module containing simple helper functions to allow for multi-threaded operations using Runspaces. ClearVss: ClearVss clear any reference of Visual SourceSafe in yours solution. You'll no longer have to delete and modify solution files by hand. It's developed in C#dewitcher Framework: A rly cool Framework, made for use with COSMOS. - Console-class that supports colored, horizontal- and vertical- centered text printing - more =PGTD Pad: Notepad for GTD TechniqueHTML/JavaScript Rendering Web Part: HTML/JavaScript Render Web Part - replaces the SharePoint Content Editor Web Part (CEWP). Permits flying in script, HTML, etc. to a SharePoint Page.Kracken Generator and Architecture Tool for Visual Studio 2012: Welcome to Kracken a suite of tools for creating code from Architecture models. This program is the pet project of Tracy Rooks.Logica101: Aplicación para visualización de procesos algebraicosManaged DismApi Wrapper: This is a managed wrapper for the native Deployment Image Servicing and Management (DISM) API. 
This allows .NET developers to call into the native API instead Mvc web-ajax: A Javascript library for displaying lists and editing objects N2F Yverdon InfoBoxes: A simple extension (plus some resources) for managing information boxes on a given page.N2F Yverdon Solar Flare Reflector: The solar flare reflector provides minimal base-range protection for your N2F Yverdon installation against solar flare interference.newplay: goodPhnyx: The project is under re-structuring - look backPigeonCms: Cms made with c# (using NET framework 3.5, SqlServer2005 or Express edition) with many Joomla-like features. Poppæa: Very early version of a c# library for cassandra nosql database. For now it adds support for the new CQL3 protocol in cassandra 1.2+. Proximity Tapper: Proximity Tapper is a developer tool for working with NFC on both Windows Phone and Windows, and allows you to build NFC apps in the Windows Phone emulator.SharePoint 2010 SpellCheck: SharePoint 2010 SpellCheck Project will let you enable spelling check functionality in SharePoint 2010 using SpellCheck.asmxShift8Read Community Credit Tool (DotNetNuke Module): A simple DotNetNuke Module I created for submitting content to Community-Credit.comTempistGamer Client: The tempistgamers game client manager. Includes a cross-platform, cross-game UI to allow for player interactivity. While the program itself manages the games.testtom12122012tfs02: fdstesttom12172012tfs02: uioTEviewer: The Transient Event Viewer is an application designed for visualization and analysis of transient events.trucho-JCI: summaryTypeSharp: TypeSharp is a C# to TypeScript code generatorWeiboWPSdk: To make wp applications about sina weibo more easily.XNA Games Core: XNA Core Game Programming library. Uses interfaces and delegates, in tune with the .NET way, with inheritence used for implementation reuse.

    Read the article

  • SharePoint 2010 PowerShell Script to Find All SPShellAdmins with Database Name

    - by Brian Jackett
    Problem
    Yesterday on Twitter my friend @cacallahan asked for some help on how she could get all SharePoint 2010 SPShellAdmin users and the associated database name.  I spent a few minutes and wrote up a script that gets this information and decided I’d post it here for others to enjoy.
    Background
    The Get-SPShellAdmin commandlet returns a listing of SPShellAdmins for the given database Id you pass in, or the farm configuration database by default.  For those unfamiliar, SPShellAdmin access is necessary for non-admin users to run PowerShell commands against a SharePoint 2010 farm (content and configuration databases specifically).  Click here to read an excellent guest post article my friend John Ferringer (twitter) wrote on the Hey Scripting Guy! blog regarding granting SPShellAdmin access.
    Solution
    Below is the script I wrote (formatted for space and to include comments) to provide the information needed. Click here to download the script.

    # declare a hashtable to store results
    $results = @{}

    # fetch databases (only configuration and content DBs are needed)
    $databasesToQuery = Get-SPDatabase | Where {$_.Type -eq 'Configuration Database' -or $_.Type -eq 'Content Database'}

    # for each database get spshelladmins and add db name and username to result
    $databasesToQuery | ForEach-Object {$dbName = $_.Name; Get-SPShellAdmin -database $_.id | ForEach-Object {$results.Add($dbName, $_.username)}}

    # sort results by db name and pipe to table with auto sizing of col width
    $results.GetEnumerator() | Sort-Object -Property Name | ft -AutoSize

    Conclusion
    In this post I provided a script that outputs all of the SPShellAdmin users and the associated database names in a SharePoint 2010 farm.  Funny enough it actually took me longer to boot up my dev VM and PowerShell (~3 mins) than it did to write the first working draft of the script (~2 mins).  Feel free to use this script and modify as needed, just be sure to give credit back to the original author.  Let me know if you have any questions or comments.  Enjoy!
    -Frog Out
    Links
    PowerShell Hashtables http://technet.microsoft.com/en-us/library/ee692803.aspx
    SPShellAdmin Access Explained http://blogs.technet.com/b/heyscriptingguy/archive/2010/07/06/hey-scripting-guy-tell-me-about-permissions-for-using-windows-powershell-2-0-cmdlets-with-sharepoint-2010.aspx

    Read the article

  • An alternative to a video codec for storing motion changes [on hold]

    - by Andrew Simpson
    I have a 3-dimensional byte array. The 3D array represents a JPEG image: each channel/array represents part of the RGB spectrum. I am not interested in retaining black pixels. A black pixel is represented by this arrangement: myarray[0,0,0] = 0; myarray[0,0,1] = 0; myarray[0,0,2] = 0; So, I have flattened this 3D array out to a 1D array by doing byte[] AFlatArray = new byte[width * height * 3] and then assigning values according to the coordinate. But like I said, I do not want black pixels, so this array has to contain only colour pixels along with their x,y coordinates. The result I want is to re-represent the image from a 1-dimensional byte array that only contains non-black pixels. How do I do that? It looks like I have to store black pixels as well because of the x,y coordinate system. I have tried writing to a binary file, but the size of that file is greater than the JPEG file, as the JPEG file is compressed. I am using C#.
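
    One simple answer is to stop treating the output as an image-shaped array and instead store a sparse list of (x, y, r, g, b) records, one per non-black pixel; the black pixels are implied by their absence and are restored by starting from an all-black canvas. A rough C# sketch follows (names assumed). Note that for raw file size a real codec such as PNG or JPEG will usually still win, because this sparse form is uncompressed.

    using System.Collections.Generic;

    static class SparsePixels
    {
        // One record per non-black pixel: its coordinates plus its colour.
        struct ColoredPixel
        {
            public ushort X, Y;
            public byte R, G, B;
        }

        static List<ColoredPixel> ToSparse(byte[,,] image, int width, int height)
        {
            var result = new List<ColoredPixel>();
            for (int y = 0; y < height; y++)
                for (int x = 0; x < width; x++)
                {
                    byte r = image[x, y, 0], g = image[x, y, 1], b = image[x, y, 2];
                    if (r == 0 && g == 0 && b == 0)
                        continue; // black pixels are simply not stored

                    result.Add(new ColoredPixel { X = (ushort)x, Y = (ushort)y, R = r, G = g, B = b });
                }
            return result;
        }

        // To rebuild the frame: allocate a new byte[width, height, 3] (all zeros = all black)
        // and write each stored pixel back at image[p.X, p.Y, channel].
    }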

    Read the article

  • Oracle Internet Directory 11gR1 11.1.1.6 Certified with Oracle E-Business Suite

    - by B Shashikumar
    We are very pleased to announce that Oracle Internet Directory 11gR1 (11.1.1.6) is now certified with Oracle E-Business Suite Releases 11i, 12.0 and 12.1. With this certification, we are offering several benefits to Oracle E-Business Suite customers: · Massive Scale: Oracle Internet Directory (OID) is a proven solution for mission critical deployments. OID can scale to extremely large deployments on less hardware as demonstrated by its published Two-Billion-User Benchmark. This reduces the footprint required to deploy enterprise directory services in the data-center resulting in cost savings and a greener enterprise. · Enhanced Security: OID is the most secure directory service that provides security at every level from data in transit to storage and backups. In addition to LDAP security, it leverages powerful Oracle database security features like Database Vault and Transparent Data Encryption · Investment Protection: This certification leverages Identity Management’s hot-pluggable capabilities enabling E-Business Suite customers to store and manage user identities in existing directory servers thus helping them maximize their investments For a complete matrix of platforms supported by Oracle Internet Directory and its components, refer to the Oracle Identity and Access Management 11gR1 certification matrix. For more information about this certification, check out the Oracle E-Business Suite blog. 

    Read the article

  • Stuck With grub rescue> console

    - by rej santos
    Here is the background of my issue: I recently installed the latest version of Ubuntu alongside Windows 8 Enterprise. However, upon checking the disk size, it seemed that some of the hard drive space was gone, so I checked the disk partitions and saw that part of the disk was being used by another file system. Thinking that Ubuntu keeps its boot files on my C: drive, I deleted that partition and formatted it so I could use it to store some movies, music, etc. Now, as I switch on my machine, I am stuck with: error: unknown filesystem grub rescue> I googled a lot and saw suggested commands that are new to me, like sudo, chainloader, etc., but all of these commands only return 'unknown command' in the console. What I want is simply to boot into my Windows 8 OS. Just to add, I can't open the BIOS menu to choose which media to boot from; as I switch on my machine it automatically takes me to the grub rescue console. Here are the things I already have: an Ubuntu installation disk and a Windows 8 system repair disk. I just don't know how to boot into these things. Let me know what to do.

    Read the article

  • New SQLOS features in SQL Server 2012

    - by SQLOS Team
    Here's a quick summary of SQLOS feature enhancements going into SQL Server 2012. Most of these are already in the CTP3 pre-release, except for the Resource Governor enhancements which will be in the release candidate. We've blogged about a couple of these items before. I plan to add detail. Let me know which ones you'd like to see more on: - Memory Manager Redesign: Predictable sizing and governing SQL memory consumption: sp_configure ‘max server memory’ now limits all memory committed by SQL ServerResource Governor governs all SQL memory consumption (other than special cases like buffer pool) Improved scalability of complex queries and operations that make >8K allocations Improved CPU and NUMA locality for memory accesses Single memory manager that handles page allocations of all sizes Consistent Out-of-memory handling & management across different internal components - Optimized Memory Broker for Column Store indexes (Project Apollo) - Resource Governor Support larger scale multi-tenancy by increasing Max. number of resource pools20 -> 64 [for 64-bit] Enable predictable chargeback and isolation by adding a hard cap on CPU usage Enable vertical isolation of machine resources Resource pools can be affinitized to individual or groups of schedulers or to NUMA nodes New DMV for resource pool affinity  - CLR 4 support, adds .NET Framework 4 advantages - sp_server_dianostics Captures diagnostic data and health information about SQL Server to detect potential failures Analyze internal system state Reliable when nothing else is working   - New SQLOS DMVs (in 2008 R2SP1) SQL Server related configuration - New DMVsys.dm_server_services OS related resource configurationNew DMVssys.dm_os_volume_statssys.dm_os_windows_infosys.dm_server_registry XEvents for SQL and OS related Perfmon counters Extend sys.dm_os_sys_info See previous blog posts here and here. - Scale / Mission critical Increased scalability: Support Windows 8 max memory and logical processorsDynamic Memory support in Standard Edition - Hot-Add Memory enabled when virtualized - Various Tier1 Performance Improvements, including reduced instructions for superlatches. Originally posted at http://blogs.msdn.com/b/sqlosteam/

    Read the article

  • Issue tracking multiple domains with Google Analytics

    - by user359650
    I have 2 domains, mydomain.com and mydomain.net, which I'm trying to track with the same GA code. Here are the options I turned on: Subdomains of mydomain ON (examples: www.mydomain.com -and- apps.mydomain.com -and- store.mydomain.com); Multiple top-level domains of mydomain ON (examples: mydomain.uk -and- mydomain.cn -and- mydomain.fr). Which gave me the following code:
    _gaq.push(['_setAccount', 'UA-123456789-1']);
    _gaq.push(['_setDomainName', 'mydomain.com']);
    _gaq.push(['_setAllowLinker', true]);
    _gaq.push(['_trackPageview']);
    In this help page I read that _setDomainName must be changed for each domain, which I did:
    - if you go to mydomain.net you get _gaq.push(['_setDomainName', 'mydomain.net']);
    - if you go to mydomain.com you get _gaq.push(['_setDomainName', 'mydomain.com']);
    When I generate traffic on both mydomain.com and mydomain.net and watch the GA push requests with Firebug, I can see requests generated for both domains, and the parameter called utmhn has the proper domain value (which matches that of _setDomainName and the browser address bar). However, when I monitor the real-time statistics under Home->Real-Time->Overview I see pageviews for mydomain.net BUT NOT for mydomain.com :( What am I missing to properly track both domains? PS: in the help page I mentioned they talk about setting up cross-links, which I didn't do for now, as my understanding is that it shouldn't be needed for what I'm trying to do. Also I want to mention that I do not have any tracking code for either of these 2 domains other than the one I mentioned.

    Read the article
