Search Results

Search found 26618 results on 1065 pages for 'amazon instance store'.

Page 306 of 1065

  • For an ORM supporting data validation, should constraints be enforced in the database as well?

    - by Ramnique Singh
    I have always applied constraints at the database level in addition to my (ActiveRecord) models, but I've been wondering whether this is really required. A little background: I recently had to unit test a basic automated timestamp-generation method for a model. Normally, the test would create an instance of the model and save it without validation. But other required fields aren't nullable in the table definition, meaning I can't save the instance even if I skip the ActiveRecord validation. So I'm wondering whether I should remove such constraints from the database itself and let the ORM handle them. Possible advantages of skipping constraints in the database, in my opinion: I can modify a validation rule in the model without having to migrate the database, and I can skip validation in testing. Possible disadvantage: if the ORM validation ever fails or is bypassed, the database no longer checks the constraints. What do you think? EDIT: In this case, I'm using the Yii Framework, which generates the model from the database, so database rules are generated as well (though I could always write them post-generation myself too).
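
    To make the tradeoff concrete, here is a minimal C# sketch (not Yii/PHP, which the question uses, and not the poster's schema - the "posts" table and its fields are invented) showing the same "required field" rule enforced once in the table definition and once in the model:

        using System;

        public static class Schema
        {
            // Database-level enforcement: the column rejects NULLs even when the
            // application layer is bypassed (raw SQL, another service, a bug).
            public const string CreatePosts = @"
                CREATE TABLE posts (
                    id         INTEGER PRIMARY KEY,
                    title      TEXT NOT NULL,
                    created_at TEXT NOT NULL
                );";
        }

        public class Post
        {
            public string Title { get; set; }
            public DateTime CreatedAt { get; set; } = DateTime.UtcNow;

            // ORM/model-level enforcement: friendlier errors and easy to change,
            // but it only runs when callers go through the model.
            public bool IsValid(out string error)
            {
                if (string.IsNullOrWhiteSpace(Title))
                {
                    error = "title is required";
                    return false;
                }
                error = null;
                return true;
            }
        }

    Dropping the NOT NULL would make the timestamp test easier to write, but then the model check is the only thing standing between bad data and the table.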

    Read the article

  • Making a file with terminal commands?

    - by iSeth
    In Windows I can write a file containing commands for cmd (usually .cmd or .bat files). When I click on those files, cmd.exe opens and runs the commands the file contains. How would I do this in Ubuntu? I'm sure this is a duplicate, but I can't find my answer. It's similar to these questions, but they don't answer the question: Store frequently used terminal commands in a file; CMD.exe Emulator in Ubuntu to run .cmd/.bat file

    Read the article

  • Angry Birds and Star Wars Join Forces for an Awesome New Edition [Plus Wallpaper!]

    - by Asian Angel
    Are you ready for a new version of Angry Birds? Then rejoice: you are less than a month away from an awesome new release of everyone’s favorite bird-slinging, pig-smashing game! Prepare for a journey to a galaxy far, far away… From the blog post: From the deserts of Tatooine to the depths of the Death Star – the game and merchandise will feature the Angry Birds characters starring as the iconic heroes of the beloved Saga. In the coming weeks, fans can expect additional new videos, characters, and much more exciting content to be revealed. The game will be available on iOS, Android, Amazon Kindle Fire, Mac, PC, Windows Phone and Windows 8. Here is the first of the promo videos for the new version. Also, make sure to download the first official wallpaper (linked below)!

    Read the article

  • SMTP, POP3 & PST. Acronyms from Hades.

    - by mikef
    A busy SysAdmin will occasionally have reason to curse SMTP. It is, certainly, one of the strangest events in the history of IT that such a deeply flawed system, designed originally purely for campus use, should have reached its current dominant position. The explanation is that it was the first open-standard email system, so SMTP/POP3 became the internet standard. We are, in consequence, saddled with a system whose security weaknesses are so extreme that messages are sent in plain text and you have no real assurance as to who the message came from anyway (SMTP-AUTH hasn't really caught on). Even without the security issues, the use of SMTP in an office environment presents a management nightmare for all commercial users responsible for complying with the regulations that control the conduct of business, such as tracking, retaining, and recording company documents. SMTP mail developed from various Unix-based systems designed for campus use that took the mail analogy so literally that mail messages were actually delivered to the users, using a 'store and forward' mechanism. This meant that, from the start, the end user had to store, manage and delete messages. This is a problem that has passed through all the releases of MS Outlook: it has to be able to manage mail locally in the dreaded PST file. As a stand-alone system, Outlook is flawed by its neglect of any means of automatic backup. Earlier Outlook PST files actually blew up without warning when they reached the 2 GB limit and became corrupted and inaccessible, leading to a thriving industry of third-party tools to clear up the mess. Microsoft Exchange is, of course, a server-based system. Emails are less likely to be lost in such a system if it is properly run. However, there is nothing to stop users from using local PSTs as well. There is the additional temptation to load emails onto mobile devices or USB keys for offline working. The result is that the System Administrator is faced with a complex hybrid system where backups have to be taken from servers and PCs scattered around the network, where duplication of emails causes storage issues, and where document retention policies become impossible to manage. If one adds to that the complexity of mobile phone email readers and mail synchronization, the problem is daunting. It is hardly surprising that the mood darkens when SysAdmins meet and discuss PST Hell. If you were promoted to the task of tormenting the souls of the damned in Hades, what aspects of the management of Outlook would you find most useful for your task? I'd love to hear from you. Cheers, Michael

    Read the article

  • Caption Competition 4: Fist Full of Captions

    - by Simple-Talk Editorial Team
    Once again we ask: What’s going on here? The best caption wins a $50 Amazon voucher. Computer-y answers for preference, but don’t let a lack of electronics stop you from dazzling us with your bon mots. Some examples to set you on your merry way: “You know what it’s like. Someone turns up to an interview in a long coat, they seem fine, but when they start the job it turns out it was a bunch of penguins.” “When I said we needed cold callers, this wasn’t really what I meant.” “Linux developers seek inspiration for new Logo” “Residents of Antarctica hold press conference to protest about Global Warming.”  You can do better. Make us laugh, win fabulous prizes. Answers in the comments, please.

    Read the article

  • Need SQL Server Hosting, 50GB or More

    - by Leo
    Hi, I am looking for a hosting solution (dedicated or shared) that will allow me to host a SQL Server database (not SQL Express, but the Web edition). The size of my database might grow to 50GB or more. The web application will perform more reads than writes. I also need daily backups and RAID 1 storage. Is there a reliable and economical hosting company that would provide this? Additional question: if there is an easy way to host MS SQL on the Amazon EC2 service, that would be preferable.

    Read the article

  • I need a flexible VPS like gandi.com [duplicate]

    - by Sharen Eayrs
    This question already has an answer here: How to find web hosting that meets my requirements? What are the alternatives? I am not asking which one you recommend; I am simply asking for a list of similar services. That being said, if there is any you've used and are happy with, please let me know. This is what I am considering: http://en.gandi.net/ Some say it's overpriced. It's like Amazon AWS, I think. So I need VPS hosting where I can easily change the CPU allocation, etc.

    Read the article

  • Programming Entity Framework, 2nd Edition (EF4) Table of Contents

    We are closing in on finalizing the 2nd edition of Programming Entity Framework! Although the rough draft chapters are already available through Safari’s Rough Cuts program (here: http://oreilly.com/catalog/9780596807252), I have been editing and reshaping the content since those chapters were published. You can get the final print edition (August 15th or perhaps a bit earlier) at O’Reilly or pre-order it on Amazon.com (here) (and elsewhere, of course!). I believe that the book will...

    Read the article

  • Are separate business objects needed when persistent data can be stored in a usable format?

    - by Kylotan
    I have a system where data is stored in a persistent store and read by a server application. Some of this data is only ever seen by the server, but some of it is passed through unaltered to clients. So there is a big temptation to persist data - whether whole rows/documents or individual fields/sub-documents - in the exact form that the client can use (e.g. JSON), as this removes various layers of boilerplate, whether in the form of procedural SQL, an ORM, or any proxy structure that exists just to hold the values before having to re-encode them into a client-suitable form. This form can usually be used on the server too, though business logic may have to live outside of the object. On the other hand, this approach ends up leaking implementation details everywhere. 9 times out of 10 I'm happy just to read a JSON structure out of the DB and send it to the client, but 1 in every 10 times I have to know the details of that implicit structure (and be able to refactor access to it if the stored data ever changes). And this makes me think that maybe I should be pulling this data into separate business objects, so that business logic doesn't have to change when the data schema does. (Though you could argue this just moves the problem rather than solving it.) There is a complicating factor in that our data schema is changing rapidly and constantly, to the point where we dropped our previous ORM/RDBMS system in favour of MongoDB and an implicit schema which was much easier to work with. So far I've not decided whether the rapid schema changes make me wish for separate business objects (so that server-side calculations need less refactoring, since all changes are restricted to the persistence layer) or for no separate business objects (because every change to the schema requires the business objects to change to stay in sync, even if the new sub-object or field is never used on the server except to pass verbatim to a client). So my question is whether it is sensible to store objects in the form in which they are usually going to be used, or whether it's better to copy them into intermediate business objects to insulate both sides from each other (even when that isn't strictly necessary)? And I'd like to hear from anybody else who has had experience of a similar situation, perhaps choosing to persist XML or JSON instead of having an explicit schema which has to be assembled into a client format each time.
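
    For illustration only, here is a small C# sketch (the question mentions MongoDB and JSON but no particular server language) contrasting the two options; the Order name and the totalCents field are invented, not the poster's schema:

        using System;
        using System.Text.Json;

        // Approach 1: forward the stored JSON untouched; server logic peeks at the
        // one field it needs and so leaks the stored field name everywhere it runs.
        public static class RawPassThrough
        {
            public static string Ship(string storedJson)
            {
                using var doc = JsonDocument.Parse(storedJson);
                long total = doc.RootElement.GetProperty("totalCents").GetInt64();
                Console.WriteLine($"server saw total = {total}");
                return storedJson; // sent to the client verbatim
            }
        }

        // Approach 2: an intermediate business object; renaming "totalCents" in the
        // store now touches only this class, not every piece of server-side logic.
        public sealed class Order
        {
            public long TotalCents { get; private set; }
            public string RawJson { get; private set; }

            public static Order FromStored(string storedJson)
            {
                using var doc = JsonDocument.Parse(storedJson);
                return new Order
                {
                    TotalCents = doc.RootElement.GetProperty("totalCents").GetInt64(),
                    RawJson = storedJson,
                };
            }

            public bool IsLargeOrder => TotalCents > 10000;
        }

    The second form costs exactly the boilerplate the question complains about; the first form works fine right up until the stored shape changes.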

    Read the article

  • After juju deploy: [Errno 2] No such file or directory

    - by user84471
    I get this after juju deploy:

        hsf@ubuntuserver:~$ juju deploy wordpress
        2012-10-23 13:12:15,663 INFO Searching for charm cs:precise/wordpress in charm store
        [Errno 2] No such file or directory: '/home/hsf/.juju/cache/downloads/cs_3a_precise_2f_wordpress-9.charmGLg7sI.part'
        2012-10-23 13:12:16,075 ERROR [Errno 2] No such file or directory: '/home/hsf/.juju/cache/downloads/cs_3a_precise_2f_wordpress-9.charmGLg7sI.part'

    I am using Ubuntu 12.04 server.

    Read the article

  • Encapsulating code in F# (Part 2)

    - by MarkPearl
    In part one of this series I showed an example of encapsulation within a local definition. This is useful to know so that you are aware of the scope of value holders etc., but what I am more interested in is encapsulation with regards to generating useful F# code libraries in .NET; this is done by using namespaces and modules. Let's have a look at some C# code first…

        using System;
        namespace EncapsulationNS
        {
            public class EncapsulationCLS
            {
                public static void TestMethod()
                {
                    Console.WriteLine("Hello");
                }
            }
        }

    Pretty simple stuff… now the F# equivalent…

        namespace EncapsulationNS

        module EncapsulationMDL =
            let TestFunction =
                System.Console.WriteLine("Hello")
                ()

    Even easier… Let's look at some specifics about F# namespaces. Namespaces are open, meaning that multiple source files and assemblies can contribute to the same namespace. Namespaces are therefore a great way to group modules together, so the question needs to be asked: what role do modules play? For me, the F# module is in many ways similar to the VB6 days of modules. In VB6, modules were separate files that simply allowed us to group certain methods together. I find it easier to visualize F# modules this way than to compare them to C# classes. That being said, one is not restricted to one module per file – there is flexibility to have multiple modules in one code file. However, with my limited F# experience I would still recommend using the file as the standard level of separating modules, as it is very easy to then find your way around a solution. An important note about interop between F# and other .NET languages: I wrote a blog post a while back about a very basic F# to C# interop. If I were to reference an F# library in a C# project (for instance ‘TestFunction’), C# would show this method as a static method call, meaning I would not have to instantiate an instance of the module.
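
    As a quick illustration of that last interop point, the following is a hedged C# sketch (not from the article) of what a referencing project might look like, assuming the F# binding were declared as a function, let TestFunction () = System.Console.WriteLine("Hello"), so that it compiles to a static method on the module's generated static class:

        // Hedged sketch: consuming the F# module from a C# project that
        // references the compiled F# library. Assumes the F# side declares
        //     let TestFunction () = System.Console.WriteLine("Hello")
        using EncapsulationNS;           // the namespace declared in the F# source

        class Program
        {
            static void Main()
            {
                // The F# module compiles to a static class, so there is no
                // instance to create - just a static method call.
                EncapsulationMDL.TestFunction();
            }
        }

    With the value-style binding shown in the article's own listing, the shape on the C# side differs slightly (a plain let value surfaces as a static member rather than a method), which is why the function form is assumed here.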

    Read the article

  • How to disable Spotlight content indexing in Mac OS

    - by o.v.
    From my Windows experience, I could always set Live search to index only file names, not their content. Is this something that can be done with Spotlight on a Mac? It used to index absolutely everything; for instance, it would return a bunch of video files for any obscure character combination typed into the search field. Right now I've disabled Spotlight entirely as per this answer, but that seems to have disabled searching altogether. For instance, Finder has yet to locate any .pdf files in a small directory as I'm typing this question (unlike Windows search, which would still be able to work even with indexing disabled). Alternatively, if there is any way (including a trusted third-party app) to index only file names and metadata, e.g. ID3 tags, that would likely be the preferred option.

    Read the article

  • Asynchronously returning hierarchical data using .NET TPL... what should my return object "look" like?

    - by makerofthings7
    I want to use the .NET TPL to asynchronously do a DIR /S and search each subdirectory on a hard drive, and I want to search for a word in each file... what should my API look like? In this scenario I know that each subdirectory will have 0..10000 files or 0..10000 directories. I know the tree is unbalanced and want to return data (in relation to its position in the hierarchy) as soon as it's available. I am interested in getting data as quickly as possible, but also want to update that result if "better" data is found ("better" means closer to the root of C:). I may also be interested in finding all matches in relation to their position in the hierarchy (akin to a report). Question: How should I return data to my caller? My first guess is that I need a shared object that will maintain the current "status" of the traversal (started | notstarted | complete), and I might base it on System.Collections.Concurrent. Another idea that I'm considering is the consumer/producer pattern (which the concurrent collections can handle), however I'm not sure what the objects "look" like. Optional logical constraint: the API doesn't have to address this, but in my "real world" design, if a directory has files, then only one file will ever contain the word I'm looking for. If someone were to literally do a DIR /S as described above then they would need to account for more than one matching file per subdirectory. More information: I'm using Azure Tables to store a hierarchy of data using these TPL extension methods. A "node" is a table. Not only does each node in the hierarchy have a relation to any number of nodes, but it's possible for each node to have a reciprocal link back to any other node. This may have issues with recursion, but I'm addressing that with a shared object in my recursion loop. Note that each "node" also has the ability to store local data unique to that node. It is this information that I'm searching for. In other words, I'm searching for a specific fixed RowKey in a hierarchy of nodes. When I search for the fixed RowKey in the hierarchy I'm interested in getting the results FAST (first node found), but prefer data that is "closer" to the starting point of the hierarchy. Since many nodes may have the particular RowKey I'm interested in, sometimes I may want to get a report of ALL the nodes that contain this RowKey.
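
    Not an answer to the Azure Tables design itself, but here is a hedged C# sketch of the consumer/producer shape the question mentions, using a plain DIR /S style file-system walk as an analogy; the Match record and WordSearch class are invented names, and depth stands in for "distance from the root":

        using System;
        using System.Collections.Concurrent;
        using System.IO;
        using System.Threading.Tasks;

        // One result, tagged with its depth so the caller can prefer matches
        // closer to the root ("better" data can replace earlier data).
        public sealed record Match(string Path, int Depth);

        public static class WordSearch
        {
            public static void Run(string root, string word)
            {
                var results = new BlockingCollection<Match>();

                // Producer: walk the tree, emitting matches as soon as they are found.
                var producer = Task.Run(() =>
                {
                    try
                    {
                        Walk(root, depth: 0, word, results);
                    }
                    finally
                    {
                        results.CompleteAdding();   // lets the consumer finish
                    }
                });

                // Consumer: report matches immediately, remembering the shallowest one.
                Match best = null;
                foreach (var m in results.GetConsumingEnumerable())
                {
                    Console.WriteLine($"found at depth {m.Depth}: {m.Path}");
                    if (best is null || m.Depth < best.Depth) best = m;
                }

                producer.Wait();
                Console.WriteLine(best is null ? "no match" : $"best: {best.Path}");
            }

            private static void Walk(string dir, int depth, string word,
                                     BlockingCollection<Match> results)
            {
                foreach (var file in Directory.EnumerateFiles(dir))
                {
                    if (File.ReadAllText(file).Contains(word))
                        results.Add(new Match(file, depth));
                }

                // Fan out over subdirectories; each level is searched in parallel.
                Parallel.ForEach(Directory.EnumerateDirectories(dir),
                                 sub => Walk(sub, depth + 1, word, results));
            }
        }

    The BlockingCollection also gives the "status" behaviour implicitly: the consumer loop only ends when the producer calls CompleteAdding, so "complete" does not need a separate flag.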

    Read the article

  • Enable a program in Windows to run multiple times?

    - by user135490
    I've got this legacy software that only allows you to run one copy at a time; it detects that you have another session open and won't allow you to open a second instance. The problem is that this is a CPU-intensive program and it only uses a single core. Are there any hacks or tweaks so I can trick it into opening more than one instance? This would allow me to retire about 5 servers... I'm using Windows 2008 R2. I had to use CFF Explorer to enable the use of more than 2GB of RAM, as the program crashes when it tries to use more than 2GB.

    Read the article

  • Does saving my progress on a U1-synced file/folder put unnecessary strain on the servers?

    - by Chauncellor
    I love Ubuntu One and I use it all the time. I have my documents and music composition folders set to sync. It's been a real boon. However, sometimes I feel that constantly saving my progress forces the file to sync dozens and dozens of times to the servers. It seems wasteful to me so I've been disconnecting U1 until I'm finished working on a project. Is this an unnecessary action that I am taking? I know it's using Amazon's storage but I'm still paranoid that I'm costing Canonical money when I constantly save my progress.

    Read the article

  • A Deep Dive into Transport Queues - Part 1

    Submission queues? Poison message queues? Johan Veldhuis unlocks the mysteries of MS Exchange's Transport queues, which are used to temporarily store messages until they are passed through to the next stage, and explains how to manage these queues.

    Read the article

  • Apps Script Office Hours - November 14, 2012

    Apps Script Office Hours - November 14, 2012. In this episode Eric ... covers the release notes from November 13, 2012; talks about a new feature that allows you to set a custom verified URL for your Apps Script web apps in the Chrome Web Store; and answers a question about OAuth 2. The schedule of future episodes can be found at developers.google.com. From: GoogleDevelopers. Time: 11:38

    Read the article

  • Android: KitKat on 13.6% of devices, Jelly Bean falls below 60%, all older versions recorded a decline

    Android: KitKat on 13.6% of devices; Jelly Bean falls below 60%, and all older versions recorded a decline. Android developers have new figures on the distribution of the different Android versions across all devices running the mobile platform that accessed the Play Store between May and June. These figures are published each month by Google so that developers can adapt their development strategy to target as many devices as possible...

    Read the article

  • Cost of a web server that hosts and delivers text only

    - by slandau
    We are developing an application that needs a web server to interact with the two (or more) entities involved. They will not ever see anything on the web, but the server is required for the transfer of data between them. It's sort of a holding point. Now, the only thing the server is going to be holding is textual data; the two entities are going to be doing the work with the data. I was wondering what the cost of this type of server would be. Since it would be JUST a database with no front end, would it make sense to use a service from Amazon or Google that just holds data for me to access, instead of buying a server and building my own database? The amount of data can grow very large, but it's only text, and most data over a day old will be deleted every day. Thanks!

    Read the article

  • Taking a Project's Development to the Next Level

    - by user1745022
    I have been looking for some advice for a while on how to handle a project I am working on, but to no avail. I am pretty much on my fourth iteration of improving an "application" I am working on; the first two times were in Excel, the third time in Access, and now in Visual Studio. The field is manufacturing. The basic idea is that I am taking read-only data from a massive Sybase server, filtering it and creating much smaller tables in Access daily (using delete and append queries), and then doing a bunch of stuff. More specifically, I use a series of queries to either combine data from multiple tables or group data in specific ways (aggregate functions), and then I place this data into a table (so I can sort and manipulate data using DAO.Recordset and run multiple custom algorithms). This process is then repeated multiple times throughout the database until a set of relevant tables is created. Many times I will create a field in a query with a value such as 1.1 so that when I append it to a table I can store information from the algorithms in that field. So as the process continues, the number of fields in the tables changes. The overall application consists of 4 "back-end" databases linked together on a shared drive, with various output (either front-end Access applications or Excel). So my question is: is this essentially how many data-driven applications that solve problems work? Each back-end database is updated with fresh data daily, and updating takes around 10 seconds (for three of them) and 2 minutes (for one). Project objectives: I want to move to SQL Server soon. The front end will be a web application (I know basic web development and like the administration flexibility), and Visual Studio will be the IDE, with C#/.NET. Should these algorithms be run "inside the database," or as a series of C# functions on each server request? I know you're not supposed to store data in a database unless it is an actual data point, and in Access I have many columns that just hold calculations from algorithms in VBA. The truth is, I have seen multiple professional Access applications and have never seen one that has the complexity or does even close to what mine does (for better or worse). But I know some professional software applications are 1000 times better than mine. So please, please, please give me a suggestion of some sort. I have been completely on my own and need some guidance on how to approach this project the right way.
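
    One hedged C# sketch of the "calculations in code, raw data in tables" idea the question is circling (all names here are invented, not the poster's actual schema):

        // Hedged sketch: instead of persisting algorithm outputs in extra table
        // columns the way the Access version did, compute them in C# when the
        // web request asks for them.
        using System.Collections.Generic;
        using System.Linq;

        public sealed record ProductionRow(string PartNumber, double CycleSeconds, int Defects);

        public static class Metrics
        {
            // A derived value: never stored, always computed from the raw rows.
            public static double DefectRatePerHour(IEnumerable<ProductionRow> rows)
            {
                var list = rows.ToList();
                double hours = list.Sum(r => r.CycleSeconds) / 3600.0;
                int defects = list.Sum(r => r.Defects);
                return hours > 0 ? defects / hours : 0.0;
            }
        }

    The raw rows stay in SQL Server and the aggregate lives only in code, so changing the algorithm never requires a schema change; the trade-off is that heavy aggregations over large tables are often better expressed as set-based SQL (views or stored procedures), with only the final shaping done in C#.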

    Read the article

  • Reading input all together or in steps?

    - by nischayn22
    For many programming quizzes we are given a bunch of input lines, and we have to process each input, do some computation and output the result. My question is: what is the best way to optimize the runtime of the solution?

    1. Read all the input, store it (in an array or something), compute the results for all of it, and finally output everything together; or
    2. Read one input, compute the result, output the result, and so on for each input given.
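
    Here is a small C# sketch (the question names no language) of the two shapes side by side, using a toy "double each number" computation as a placeholder:

        using System;
        using System.Collections.Generic;
        using System.Text;

        class Program
        {
            static void Main()
            {
                // Approach 1: read everything, compute everything, print once.
                // Buffering the output avoids many small writes to the console.
                var lines = new List<string>();
                string line;
                while ((line = Console.ReadLine()) != null)
                    lines.Add(line);

                var output = new StringBuilder();
                foreach (var l in lines)
                    output.AppendLine((2 * long.Parse(l)).ToString());
                Console.Write(output);

                // Approach 2 (shown for contrast): stream line by line.
                // Uses O(1) memory, which matters when the input is huge.
                // while ((line = Console.ReadLine()) != null)
                //     Console.WriteLine(2 * long.Parse(line));
            }
        }

    In practice the runtime difference usually comes from I/O buffering rather than from when you compute, so either shape works well as long as reads and writes are buffered; the streaming form additionally keeps memory use flat.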

    Read the article

  • VirtualBox - not filling entire screen

    - by jdavis
    I am new to VirtualBox and am trying to set up an instance of Windows 7 64. I have the virtual machine instance running with Windows 7 now installed, but it only fills up a small portion of my screen. Even when I go full screen, the window stays the same size and the rest of the screen is filled with gray space. I have installed VirtualBox Guest Additions, which allowed me to go from a resolution of 800x600 to 1024x768, but this still isn't satisfactory as my laptop display is 1600x900. Any help on this would be most appreciated. Thanks.

    Read the article
