Search Results

Search found 18126 results on 726 pages for 'bsi support'.


  • SQL SERVER – Storing Variable Values in Temporary Array or Temporary List

    - by pinaldave
    SQL Server does not support arrays or a dynamic-length storage mechanism like a list. There are certainly some clever workarounds and a few extraordinary solutions, but not everybody can come up with them; additionally, sometimes the requirement is so simple that extraordinary coding is not required. Here is the simple case. Let us say we have the values a, 10, 20, c, 30, d, and the requirement is to store them in an array or a list. It is very easy to do this in C# or C. However, there is no quick way to do the same in SQL Server. Every single time I get such a requirement, I create a table variable and store the values in it. Here is the example, for SQL Server 2012:
        DECLARE @ListofIDs TABLE(IDs VARCHAR(100));
        INSERT INTO @ListofIDs VALUES('a'),('10'),('20'),('c'),('30'),('d');
        SELECT IDs FROM @ListofIDs;
        GO
    When executed, the above script returns the values as a single-column resultset. The above script will work in SQL Server 2012 only; for SQL Server 2008 and earlier versions, run the following code:
        DECLARE @ListofIDs TABLE(IDs VARCHAR(100), ID INT IDENTITY(1,1));
        INSERT INTO @ListofIDs
        SELECT 'a'
        UNION ALL SELECT '10'
        UNION ALL SELECT '20'
        UNION ALL SELECT 'c'
        UNION ALL SELECT '30'
        UNION ALL SELECT 'd';
        SELECT IDs FROM @ListofIDs;
        GO
    Now in this case, I have to convert the numbers to varchar because I have to store mixed datatypes in a single column. Additionally, this quick solution does not offer any of the features of real arrays (such as inserting values in between, or accessing values by array index). Well, do you ever have to store multiple temporary values in SQL Server? If the count of values is dynamic and the datatype is not specified up front, how would you go about storing values that can be used later in the program? Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Adding multiple Exchange accounts under one profile in Outlook 2010 or 2013

    - by Karel Smutný
    I work for two companies, each with its own Exchange server, and I want to configure my Outlook for both email accounts. I am not able to add both accounts under the same profile in either Outlook 2010 or 2013. I found some how-to articles, each involving the Office Configuration Tool; however, my installation does not support this tool. I have Office 2010 Home and Business, and the Office 2013 Preview (click-to-run). Is there another way? P.S. Having two different profiles, one for each Exchange account, is inconvenient: I cannot run two instances of Outlook at the same time, and switching back and forth all the time is tedious.

    Read the article

  • How to boot from a flash drive OS using VirtualBox?

    - by kokbira
    I have two flash drives, one with Slax installed and the other with Android-x86 Live, but they do not boot on my laptop (at work they boot perfectly). I can boot some live CDs/DVDs or their ISO files using VirtualBox, but I cannot do the same with live flash drives - I plug in the flash drive and start a VirtualBox VM without any virtual HD, but VirtualBox does not recognize it as a boot option, as it does for CDs/DVDs. Any ideas? Any alternatives if VirtualBox does not support it? Edit 1: I'm using Windows (Windows 7), but I would like to know how to do it in Linux (Ubuntu, for example) too.
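
    One workaround often suggested for this (added here for context, not from the question itself) is to wrap the physical USB stick in a raw VMDK descriptor, which VirtualBox can then attach and boot like any other disk. A minimal sketch, assuming VirtualBox's VBoxManage is on the PATH and the stick shows up as PhysicalDrive1 on Windows:

        import subprocess

        # Create a raw-disk VMDK that points at the USB stick; attach the
        # resulting usb.vmdk to the VM as its hard disk, then boot the VM.
        # Check the drive number first (e.g. "wmic diskdrive list brief") -
        # pointing this at the wrong PhysicalDrive is dangerous.
        subprocess.run([
            "VBoxManage", "internalcommands", "createrawvmdk",
            "-filename", "usb.vmdk",
            "-rawdisk", r"\\.\PhysicalDrive1",
        ], check=True)

    VirtualBox usually needs to run with administrator rights for raw-disk access to work.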

    Read the article

  • Ubuntu and MySQL server: something isn't allowing me to connect

    - by acidzombie24
    I have a question about MySQL settings (http://serverfault.com/questions/94054/remote-connections-and-mysql-on-ubuntu/94088#94088), and now I want to figure out why I cannot connect. I made sure bind-address was commented out. I can ping the server from within the VM, but mysqladmin --protocol=tcp --host=self_ip ping fails from within the VM. I also followed along and checked whether my ports were open, and they look like they are. I set up Samba on that VM and can access it with no problem as well. It looks like Ubuntu does not have a firewall enabled either (I figured this out before), so I am stumped as to why the server isn't allowing my connection. Apparently the same config file works on another person's machine (http://www.pastie.org/742545). I am using Ubuntu 6.06 LTS just for 'support' reasons. So hopefully this will be 'easy'?
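
    A quick way to separate "MySQL is refusing the connection" from "nothing is listening on TCP at all" is a bare socket test from the client side. A minimal sketch (the host IP is a placeholder for the VM's address):

        import socket

        HOST = "192.168.56.101"  # placeholder: the MySQL VM's IP

        try:
            # Success means something is listening on 3306, so the problem
            # is MySQL's own config (skip-networking, bind-address, grants).
            with socket.create_connection((HOST, 3306), timeout=5) as s:
                print("TCP connect OK:", s.getpeername())
        except OSError as e:
            # Failure means MySQL is not listening on TCP at all, or a
            # firewall is dropping packets before they reach it.
            print("TCP connect failed:", e)

    On a MySQL of that era it is also worth checking my.cnf for skip-networking, which disables TCP connections entirely even with bind-address commented out.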

    Read the article

  • Oracle RAC interconnect in a Dell M1000e Blade Enclosure

    - by Antitribu
    We are looking at a Dell M1000e enclosure and appropriate blades with 4 NICs each. We are planning on running Linux/Oracle 11g RAC on two blades; storage will be handled on an iSCSI SAN, to which two NICs (via passthrough) will be connected, leaving us with two NICs (via the blade-centre switches). We would like to have an interconnect (obviously), an external IP and an internal IP. Would best practice be to:
    - bond the remaining two interfaces and VLAN as appropriate to provide three virtual interfaces?
    - run the interconnect on one interface and VLAN the external/internal interfaces on the other?
    - purchase a blade with more NICs, because the above is a terrible idea?
    - another option?
    Please feel free to point out the blindingly obvious, or relevant documentation on support.oracle.com. I am specifically interested in supported configurations and best practices. Thanks!

    Read the article

  • What's a good public access terminal solution using old PCs and remote VMs?

    - by greenfingers
    Has anyone had experience using VMs as remote desktops for public access terminals (e.g. an internet cafe)? In our case we don't want to charge money for access, but I figure this solution has a few advantages, such as:
    - VMs can easily be re-built daily, erasing private data and clutter
    - rickety old PCs can serve as the 'dumb' terminals
    - less IT support is needed on site
    Can you suggest tools to help do this? Keeping the terminals up and running as much of the time as possible is the main priority, so they need to boot straight into a full-screen remote desktop and stay there.

    Read the article

  • Forum engine with full LDAP integration [closed]

    - by Andrian Nord
    We are looking for a forum engine that can actually maintain its user data in LDAP, possibly via mods. The core point is the ability to maintain the data, i.e. all user profile settings - nickname, password, email, avatar, birthday and others (preferably configurable). One example of good LDAP integration, of the level I'm expecting, is Drupal's LDAP integration, which allows mapping any user attribute into LDAP and keeps it in sync with the database. A year ago I did some research on the existing free/FOSS engines and found a few forum engines with LDAP integration, namely SMF, phpBB and something else. The most maintained solution was provided by phpBB3, which supports LDAP integration out of the box, but it is unable to sync data with changes made on the LDAP server by other software. Actually, it wasn't even propagating changes back, never mind the ability to map additional attributes (other than name/password/email). Also, I haven't found any forum with an architecture that has a proper abstraction over user settings, so I doubt these engines (including phpBB) can be modded to add such functionality without dramatic changes to the core codebase. More recent research showed that even some commercial software, like IPB, is unable to keep its database synced with an LDAP directory and map additional attributes. In other words, all the support I've seen so far is simple user creation upon the user's first login, which is not good for us, as the forum is not the primary site and should not maintain its own user base (to reduce the risk of possible collisions). LDAP import is required because many other services (FTP, email, Jabber, a Drupal site) use the same user base. Currently we have a forum embedded into the Drupal site, but we are unsatisfied with its features. BTW, we are using Linux, and this is not a duplicate of this question, as its author seems to be satisfied with the behaviour described above. So, my question is: are there any (preferably FOSS and free) forum engines that can import, export, keep in sync, or otherwise integrate with an LDAP user database (preferably with the ability to map additional fields to LDAP attributes)?
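
    For reference, the round-trip the poster wants - read a profile field from the directory and push a forum-side edit back - is only a couple of calls in a generic LDAP client. A minimal sketch using Python's ldap3 library (the server, bind DN, user DN and attribute names are all placeholders):

        from ldap3 import Server, Connection, MODIFY_REPLACE

        server = Server("ldap.example.org")  # placeholder host
        conn = Connection(server, "cn=forum,dc=example,dc=org", "secret",
                          auto_bind=True)

        user_dn = "uid=jdoe,ou=people,dc=example,dc=org"

        # Pull the profile attributes the forum should display.
        conn.search(user_dn, "(objectClass=inetOrgPerson)",
                    attributes=["cn", "mail", "jpegPhoto"])
        print(conn.entries[0].mail)

        # Push a profile edit made in the forum back to the directory,
        # so every other service (FTP, Jabber, Drupal) sees it too.
        conn.modify(user_dn, {"mail": [(MODIFY_REPLACE, ["jdoe@example.org"])]})

    The hard part the poster describes is not these calls but a forum architecture whose user-settings layer is abstract enough to route every profile read/write through them.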

    Read the article

  • Oracle Fusion Applications User Experience White Papers by Anna M. Wichansky

    - by JuergenKress
    The Applications User Experience group has created a series of white papers to better communicate the world-class user experience features of the Oracle Fusion Applications, and to describe the process we used to design them. These papers explain not only why the Oracle Fusion Applications user experience is designed the way it is, but also the data collection, modeling, and testing efforts of our unique, user-focused design process, which contributed to its refinement. The documents we are sharing with the product announcement are: Applications User Experience Research and Design Process White Paper; New Oracle Fusion Applications: An End-User Experience Designed for Productivity; Why Oracle Expects Productivity Gains with Fusion Applications; Closing the Deal: the Oracle Fusion Customer Relationship Management User Experience; Oracle Fusion Human Capital Management: Designed for a Productive Workforce; Get It Done Fast, Get It Done Right: The Oracle Fusion Financials User Experience; Oracle Fusion Applications User Experience Design Patterns: Productivity Realized; Oracle Fusion Procurement: Changing the Way You Buy and Sell; Putting the User into Oracle Fusion Applications User Assistance. Read the full article here. WebLogic Partner Community: for regular information, become a member of the WebLogic Partner Community - please visit http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Wiki Technorati Tags: User Experience,Design patterns,WebLogic Community,Oracle,OPN,Jürgen Kress

    Read the article

  • How can I create the smallest possible mirror of the archive?

    - by Registered User
    I need to create an HTTP URL on my laptop so that an Ubuntu installation can begin on my laptop in a Xen environment - the host and the client are both going to be my laptop. I Googled and came across apt-mirror and some other packages, but I do not want to archive the entire 15 GB of Ubuntu repositories on my machine. It is not possible to use a CD, ISO, or loop-mounted disk (the reason is mentioned below). I have tried using the netboot image on the local machine, which failed, because if you are attempting to create a virtual machine on hardware which does not support VT, the virt-manager installer necessarily needs a URL of this sort: http://archive.ubuntu.com/ubuntu/dists/hardy/main/installer-i386/current/images/netboot/ Any other option to create a guest OS is simply grayed out. The unfortunate part is that my Ethernet connections do not work when I boot with Xen 4.0 and a pv-ops Dom0 kernel from Jeremy's tree, which is where I have to do this work. So I have to create a URL structure similar to the Ubuntu mirrors'. How can I do this with the bare minimum, so that at least the console boots, and once the console comes up I can do some work?
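
    Since virt-manager only needs the netboot installer tree over HTTP, one low-effort approach (a suggestion, not from the question) is to fetch just that tree - a few tens of megabytes rather than 15 GB - and serve it locally with the same path layout as the real mirror. A minimal sketch using Python 3's built-in HTTP server:

        # Serve ./mirror so that
        #   http://127.0.0.1:8000/ubuntu/dists/hardy/main/installer-i386/current/images/netboot/
        # mimics archive.ubuntu.com; assumes the netboot tree has already
        # been downloaded into ./mirror/ubuntu/dists/hardy/... (e.g. with
        # a recursive wget).
        from functools import partial
        from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

        handler = partial(SimpleHTTPRequestHandler, directory="mirror")
        ThreadingHTTPServer(("127.0.0.1", 8000), handler).serve_forever()

    Point virt-manager at the local URL so the installer can at least pull the kernel and initrd and bring up the console.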

    Read the article

  • I can't launch any Win8 apps after upgrading to Windows 8.1

    - by locka
    I just upgraded to 8.1 and now none of the Metro apps start. If I start any Metro app, including the Store and PC Settings, it immediately fails. The classic desktop is fine, as are standard programs; it's just the Metro apps. If I look in the system event log I see errors like this: *Activation of application winstore_cw5n1h2txyewy!Windows.Store failed with error: This application does not support the contract specified or is not installed. See the Microsoft-Windows-TWinUI/Operational log for additional information.* In addition, the tiles on the Start screen have a small cross icon on them. I suspect that my Live ID (which I somehow managed to skip during the upgrade) is not set up properly, and consequently none of the online stuff works. But how do I fix this? I can't start PC Settings, I can't start the Store, and I see no way of setting these things from the classic desktop. I don't want to have to reinstall over this. Is there a simple fix?

    Read the article

  • Can I rely on S3 to keep my data secure?

    - by Jamie Hale
    I want to back up sensitive personal data to S3 via an rsync-style interface. I'm currently using s3cmd - a great tool - but it doesn't yet support encrypted syncs. This means that while my data is encrypted (via SSL) during transfer, it's stored on their end unencrypted. I want to know if this is a big deal. The S3 FAQ says "Amazon S3 uses proven cryptographic methods to authenticate users... If you would like extra security, there is no restriction on encrypting your data before storing it in Amazon S3." Why would I like extra security? Is there some way my buckets could be opened to prying eyes without my knowing? Or are they just trying to save you when you accidentally change your ACLs and make your buckets world-readable?
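
    For what it's worth, "encrypting your data before storing it in Amazon S3" is only a few lines with today's tooling. A minimal sketch using boto3 and the cryptography package (both are modern libraries used purely as an illustration; neither existed when this was asked, and the bucket/file names are placeholders):

        import boto3
        from cryptography.fernet import Fernet

        key = Fernet.generate_key()  # keep this key safe, and OFF of S3
        with open("backup.tar", "rb") as f:
            ciphertext = Fernet(key).encrypt(f.read())

        # Amazon only ever sees ciphertext, so an accidental world-readable
        # ACL leaks nothing useful.
        boto3.client("s3").put_object(Bucket="my-backups",
                                      Key="backup.tar.enc", Body=ciphertext)

    The trade-off is that rsync-style delta sync stops working well, since encryption scrambles the whole file on every change - which is exactly why the poster wants encrypted sync built into the tool itself.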

    Read the article

  • SQL Server 2008 R2 100% availability

    - by Mark Henderson
    Is there any way to provide 100% uptime on SQL Server 2008 R2? From my experience, the downtimes for the different replication methods are: log shipping - lots (for DR only); mirroring w/ NLB - ~45 seconds; clustering - ~5-15 seconds. And all of these solutions involve all of the connections being dropped from the source, so if the downtime is too long, or the app's gateway doesn't support reconnection in the middle of a task, then you're out of luck. The only way I can think of to get around this is to abstract the clustering up a level (by virtualising and then enabling VMware FT. Yuck. Good luck getting that to work on a quad-socket, 32-core system anyway.) Is there any other way of providing 100% uptime for SQL Server?
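
    As an aside, the "gateway doesn't support reconnection" half of the problem can sometimes be patched over in the application tier with a retry wrapper that rides out a short failover. A minimal sketch, assuming a Python client using pyodbc (the connection string, query and retry budget are placeholders):

        import time
        import pyodbc  # assumption: the client talks to SQL Server via pyodbc

        def query_with_retry(conn_str, sql, attempts=5, delay=3.0):
            """Re-run a read-only query across a ~5-15 s cluster failover."""
            for attempt in range(attempts):
                try:
                    with pyodbc.connect(conn_str, timeout=5) as conn:
                        return conn.cursor().execute(sql).fetchall()
                except pyodbc.Error:
                    if attempt == attempts - 1:
                        raise          # failover took longer than our budget
                    time.sleep(delay)  # wait for the instance to come back

    This only masks the dropped connections; it does nothing for in-flight transactions, which is exactly why the length of the failover window matters.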

    Read the article

  • Additional new content SOA Partner Community

    - by JuergenKress
    Oracle Reference Architecture: Application Infrastructure Foundation. One of the earliest additions to the IT Strategies from Oracle library, this paper describes the concepts and capabilities of the application infrastructure and defines the platform on which solutions are built. Read it. Scaling Service Oriented Architecture. What is scaling, and what does it mean to a service-oriented architecture? Author Philip Wik explores those issues and proposes Oracle-based solutions to SOA scaling and a SOA scaling roadmap. Read it. SOA, Cloud, and Service Technologies: A Conversation with Thomas Erl. Thomas Erl, the world's best-selling SOA author, is joined by Oracle SOA experts Tim Hall and Demed L'Her for a wide-ranging four-part conversation on the evolution of SOA and the emergence of the architect in the era of cloud computing. Listen to the Podcast & Read a Transcript. Cloud e-book: invite your customers to download this Cloud e-book, packed with multimedia resources to educate your customers on the value of Oracle Cloud computing. Assessment: are you leading or lagging when it comes to SOA and BPM? Take the online SOA Assessment and BPM Assessment. New collateral: Whitepaper series: The Promise of BPM Technology for Financial Services Institutions - Resource Kit; Whitepaper: Reaching Process Excellence with Process Accelerators - PDF; Demystifying Cloud Integration: Whitepapers, webcasts, and customer case studies - Resource Kit; Whitepaper: Leveraging Governance to Sustain Enterprise Architecture - PDF; Article: Rethink SOA: A Recipe for Business Transformation; Oracle SOA Resource Kit; Oracle SOA Governance Resource Kit; Oracle BPM Resource Kit. SOA & BPM Partner Community: for regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Technorati Tags: SOA Community,Oracle SOA,Oracle BPM,Community,OPN,Jürgen Kress

    Read the article

  • How do I fake 2 discrete monitors using a DualHead2Go?

    - by Sietse
    I just got a Matrox DualHead2Go (http://www.matrox.com/graphics/en/products/gxm/dh2go/) for use with my MacBook Pro. I realise that the reason it works is that it fakes one big (wide) monitor. I had also kind of depended on the software that comes with it to trick OS X into accepting it as two monitors. It turns out the support is kind of lame: it just adds shortcuts for maximizing the window to whichever screen you want. And it even gets that wrong, since my dock doesn't auto-hide but isn't taken into account while resizing, causing my window to end up "behind" my dock. (I've made an AppleScript that does the resize correctly, which I'll post below.) There are two glaring issues this causes: full-screen content (video, etc.) takes up both monitors, and dialogs just pop up in the middle, straddling the two screens. Is there a way to trick OS X, or at least a way to fix these issues?

    Read the article

  • Making the WIF local STS work with your ASP.NET application

    - by DigiMortal
    Making the Windows Identity Foundation (WIF) STS test application work with your solution is not as straightforward a process as books and articles suggest. There are some tricks and some configuration modifications you must make to get things working. Fortunately these steps are simple ones.
    1. Move your application to IIS or IIS Express. If your application uses the development web server that ships with Visual Studio, make it use IIS or IIS Express instead. You get simple IIS Express support in Visual Studio 2010 after installing Visual Studio 2010 SP1; you can read more in my blog posting "Visual Studio 2010 SP1 Beta supports IIS Express". NB! You don't have to move your dummy STS project to IIS.
    2. Change the request validation mode to ASP.NET 2.0. Next, you will get the following error when coming back from the dummy STS service: HttpRequestValidationException (0x80004005): A potentially dangerous Request.Form value was detected from the client. Open the web.config of your application and add the following line before </system.web>: <httpRuntime requestValidationMode="2.0" /> Now you are done with configuring the web application to work with the STS.

    Read the article

  • Active Directory: how do you pull a list of accounts that belong to a user?

    - by Jack
    I'm a software developer currently stuck supporting CyberArk at a large company. I need to pull up a list of accounts that belong to a certain user. For example, let's say I have a user account named Bob and I want to find all the accounts that belong to Bob in AD - meaning that on the Organization tab of the account's properties, the Manager should be Bob. I have absolutely zero knowledge of AD beyond the very basics. Is there a way to do it? I only have access to the "Active Directory Users and Computers" tool, and I'm not even sure I have enough privileges to run a script or install PowerShell, but I would like to know the script or PowerShell command to do so if there is one.
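
    For reference, the Organization tab's "Manager" field is stored in each account's manager attribute (as the manager's distinguished name), so any LDAP client that can reach a domain controller can answer this. A minimal sketch with Python's ldap3 library - the host, credentials and DNs are placeholders:

        from ldap3 import Server, Connection, NTLM, SUBTREE

        conn = Connection(Server("dc01.corp.example.com"),
                          user="CORP\\jack", password="...",
                          authentication=NTLM, auto_bind=True)

        # Find every user object whose 'manager' attribute is Bob's DN.
        bob_dn = "CN=Bob,OU=Staff,DC=corp,DC=example,DC=com"
        conn.search("DC=corp,DC=example,DC=com",
                    f"(&(objectClass=user)(manager={bob_dn}))",
                    SUBTREE, attributes=["sAMAccountName", "displayName"])

        for entry in conn.entries:
            print(entry.sAMAccountName, entry.displayName)

    An ordinary authenticated read is normally enough for this query in a default domain - no admin rights or PowerShell installation required.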

    Read the article

  • WinDVD AACS key update doesn't work

    - by Jeremy French
    A relative has a Vaio HTPC which has Blu-ray; the player, I believe, is a Vaio-branded version of WinDVD with Blu-ray support. Every time it loads, it asks if it should update the AACS keys, as they are out of date. If you select yes, nothing happens: the updater just sits there forever saying that you shouldn't turn it off. I have tried going to the product's website (which is now Corel's), registering, and downloading the keys, but the key download page does nothing. Has anyone had a similar problem, or are there any suggestions for getting around it?

    Read the article

  • Why Are Minimized Programs Often Slow to Open Again?

    - by Jason Fitzpatrick
    It seems particularly counterintuitive: you minimize an application because you plan on returning to it later and wish to skip shutting the application down and restarting it later, but sometimes maximizing it takes even longer than launching it fresh. What gives?
    Today's Question & Answer session comes to us courtesy of SuperUser - a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.
    The Question
    SuperUser reader Bart wants to know why he's not saving any time with application minimization: I'm working in Photoshop CS6 and multiple browsers a lot. I'm not using them all at once, so sometimes some applications are minimized to the taskbar for hours or days. The problem is, when I try to maximize them from the taskbar, it sometimes takes longer than starting them fresh! Photoshop especially feels really weird for many seconds after finally showing up; it's slow, unresponsive, and sometimes even freezes completely for a minute or two. It's not a hardware problem, as it's been like that forever on all my PCs. Would I also notice it after upgrading my HDD to an SSD and adding RAM (my main PC currently holds 4 GB)? Could people with powerful PCs/Macs tell me: does it also happen to you? I guess OSes somehow "focus" on active software and move all the resources away from the ones that are running but not being used. Is it possible to somehow set RAM/CPU/HDD priorities for, let's say, Photoshop, so it won't slow down after a long period of inactivity?
    So what is the deal? Why does he find himself waiting to maximize a minimized app?
    The Answer
    SuperUser contributor Allquixotic explains why:
    Summary: The immediate problem is that the programs you have minimized are being paged out to the "page file" on your hard disk. This symptom can be improved by installing a Solid State Disk (SSD), adding more RAM to your system, reducing the number of programs you have open, or upgrading to a newer system architecture (for instance, Ivy Bridge or Haswell). Of these options, adding more RAM is generally the most effective solution.
    Explanation: The default behavior of Windows is to give active applications priority over inactive applications for having a spot in RAM. When there's significant memory pressure (meaning the system doesn't have a lot of free RAM if it were to let every program have all the RAM it wants), it starts putting minimized programs into the page file, which means it writes out their contents from RAM to disk and then makes that area of RAM free. That free RAM helps the programs you're actively using - say, your web browser - run faster, because if they need to claim a new segment of RAM (like when you open a new tab), they can do so. This "free" RAM is also used as page cache, which means that when active programs attempt to read data on your hard disk, that data might be cached in RAM, which avoids accessing the hard disk to get it. By using the majority of your RAM for page cache and swapping out unused programs to disk, Windows is trying to improve the responsiveness of the program(s) you are actively using, by making RAM available to them and by caching the files they access in RAM instead of on the hard disk. The downside of this behavior is that minimized programs can take a while to have their contents copied from the page file, on disk, back into RAM, and the time increases the larger the program's footprint in memory. This is why you experience that delay when maximizing Photoshop.
    RAM is many times faster than a hard disk (depending on the specific hardware, it can be several orders of magnitude faster). An SSD is considerably faster than a hard disk, but it is still slower than RAM by orders of magnitude. Having your page file on an SSD will help, but it will also wear out the SSD more quickly than usual if your page file is heavily utilized due to RAM pressure.
    Remedies: Here is an explanation of the available remedies, and their general effectiveness:
    - Installing more RAM: This is the recommended path. If your system does not support more RAM than you already have installed, you will need to upgrade more of your system: possibly your motherboard, CPU, chassis, power supply, etc., depending on how old it is. If it's a laptop, chances are you'll have to buy an entirely new laptop that supports more installed RAM. When you install more RAM, you reduce memory pressure, which reduces use of the page file - a good thing all around. You also make more RAM available for page cache, which will make all programs that access the hard disk run faster. As of Q4 2013, my personal recommendation is at least 8 GB of RAM for any desktop or laptop whose purpose is anything more complex than web browsing and email: photo editing, video editing/viewing, playing computer games, audio editing or recording, programming/development, and so on should all have at least 8 GB of RAM, if not more.
    - Run fewer programs at a time: This will only work if the programs you are running do not use a lot of memory on their own. Unfortunately, Adobe Creative Suite products such as Photoshop CS6 are known for using an enormous amount of memory. This also limits your multitasking ability. It's a temporary, free remedy, but it can be an inconvenience to close down your web browser or Word every time you start Photoshop, for instance. This also wouldn't stop Photoshop from being swapped out when you minimize it, so it really isn't a very effective solution; it only helps in some specific situations.
    - Install an SSD: If your page file is on an SSD, the SSD's improved speed compared to a hard disk will result in generally improved performance when the page file has to be read from or written to. Be aware that SSDs are not designed to withstand a very frequent and constant random stream of writes; they can only be written over a limited number of times before they start to break down, and heavy use of a page file is not a particularly good workload for an SSD. You should install an SSD in combination with a large amount of RAM if you want maximum performance while preserving the longevity of the SSD.
    - Use a newer system architecture: Depending on the age of your system, you may be using an out-of-date system architecture. The "system architecture" is generally defined as the "generation" (think generations like children, parents, grandparents, etc.) of the motherboard and CPU. Newer generations generally support faster I/O (input/output), better memory bandwidth, lower latency, and less contention over shared resources, instead providing dedicated links between components. For example, starting with the "Nehalem" generation (around 2009), the Front-Side Bus (FSB) was eliminated, which removed a common bottleneck, because almost all system components had to share the same FSB for transmitting data. This was replaced with a "point to point" architecture, meaning that each component gets its own dedicated "lane" to the CPU, which continues to be improved every few years with new generations. You will generally see a more significant improvement in overall system performance depending on the "gap" between your computer's architecture and the latest one available. For example, a Pentium 4 architecture from 2004 is going to see a much more significant improvement upgrading to "Haswell" (the latest as of Q4 2013) than a "Sandy Bridge" architecture from ~2010.
    Links: Related questions: How to reduce disk thrashing (paging)? Windows Swap (Page File): Enable or Disable? Also, just in case you're considering it, you really shouldn't disable the page file, as this will only make matters worse; see here. And, in case you needed extra convincing to leave the Windows page file alone, see here and here.
    Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.
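
    As a quick way to see this mechanism in action, you can watch your own memory and swap pressure while minimizing and restoring a heavy application. A small sketch using the third-party psutil package (an illustration added here, not part of the original answer):

        import time
        import psutil

        # Sample memory and swap every 5 seconds; minimize/restore Photoshop
        # and watch swap usage climb as its working set is paged out.
        while True:
            vm, sw = psutil.virtual_memory(), psutil.swap_memory()
            print(f"RAM used {vm.percent:5.1f}%   "
                  f"swap used {sw.percent:5.1f}% ({sw.used // 2**20} MiB)")
            time.sleep(5)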

    Read the article

  • New partnership allows auto-transposition of client/server application to Windows Azure

    - by Webgui
    The economics of IT is changing rapidly, and organizations are searching for ways to widen and secure the availability of their systems while at the same time lowering costs - which is exactly what the cloud is meant to do. Running your systems on Microsoft's Windows Azure cloud, for example, would improve and secure the availability, accessibility and scalability (both up and down) of your systems and support the new IT economics. However, in order to take advantage of the cloud's promise of lower cost of ownership, applications must be built or adjusted to work on that platform, and in most cases this is not a simple task. Even existing web applications cannot always be transferred to Azure without changes, and for client/server applications the task is far more challenging, even to the point where it seems impossible. The reason is the gap between client/server desktop technology and the cloud's. For that reason, most of the known methodologies for migrating existing client/server applications actually involve a rewrite of the desktop systems for the cloud. A unique approach is introduced by Visual WebGui: it creates a virtualization layer atop the ASP.NET web server, moves the transformed or generated .NET code to that layer, and then, using a patent-pending protocol, renders the user interface within a plain browser. The end result is pure .NET code that serves as the base code for a pure rich web application, and now, due to a collaboration with Microsoft Windows Azure, Visual WebGui provides the shortest path from client/server to the Azure cloud by handling close to 95% of the transformation to the cloud platform automatically. Application migration to Azure without migraines. More information about the Instant CloudMove Azure solution here.

    Read the article

  • Preferred lambda syntax?

    - by Roger Alsing
    I'm playing around a bit with my own C-like DSL grammar and would like some opinions. I've reserved the use of "(...)" for invocations, e.g.: foo(1,2); My grammar supports "trailing closures", pretty much like Ruby's blocks, which can be passed as the last argument of an invocation. Currently my grammar supports trailing closures like this:
    foo(1,2) { //parameterless closure passed as the last argument to foo }
    or
    foo(1,2) [x] { //closure with one argument (x) passed as the last argument to foo print (x); }
    The reason I use [args] instead of (args) is that (args) is ambiguous:
    foo(1,2) (x) { }
    There is no way in this case to tell whether foo expects 3 arguments (int, int, closure(x)), or whether foo expects 2 arguments (int, int) and returns a closure with one argument, closure(x). So that's pretty much the reason why I use [] for now. I could change this to something like:
    foo(1,2) : (x) { }
    or
    foo(1,2) (x) -> { }
    So the actual question is: what do you think looks best? [...] is somewhat wrist-unfriendly:
    let x = [a,b] { }
    Ideas?

    Read the article

  • How to configure Windows 2008 R2 server for LAN and wireless internet connections

    - by Alchemical
    For special testing purposes, we need a Windows server that allows the following: a team member can log in remotely to the server and, while logged in, disconnect the wireless connection, perform a few tests, and then reconnect the wireless connection. In general, the LAN connection would just be used for the remote login; the wireless connection would be used for performing tests, including using a web browser to test certain web sites, etc. How can we successfully configure the server to support two network connections like this (a regular LAN connection plus a wireless connection), and also make sure that the tests we perform using the browser utilize the wireless connection for the outgoing internet activity?
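
    One way to verify that a given test really leaves via the wireless adapter - independent of how the routing table is set up - is to bind the test's socket to that adapter's address before connecting. A minimal sketch; the address is a placeholder for the wireless NIC's IP:

        import socket

        WIRELESS_IP = "192.168.2.50"  # placeholder: the wireless adapter's IP

        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.bind((WIRELESS_IP, 0))      # force the source address/interface
        s.connect(("example.com", 80))
        s.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
        print(s.recv(200).decode(errors="replace"))
        s.close()

    Windows follows the strong-host model, so traffic bound to the wireless adapter's address will egress through that adapter; browser traffic, by contrast, follows the interface-metric order in the routing table.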

    Read the article

  • Disk / system configuration for log collection / syslog server

    - by Konrads
    I am looking into building a syslog/logging infrastructure and am pondering some architecture best practices. Essentially, I see that a syslog system needs to support two conflicting workloads: log collection, where potentially massive streams of data need to be written quickly to disk and indexed; and log querying, where logs will be queried both by fixed fields, such as date and source, and by text search. What is the best disk/system setup, assuming I'd like to keep it to a single server for now? Should I use SSDs or a ramdisk to offload some processing? Some disks in a stripe and some in RAID 5? I am particularly eyeing Graylog2 with ElasticSearch/MongoDB.
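
    To make the two workloads concrete: the collection side's hot path is little more than "receive datagram, append to disk", which is why it rewards sequential-write-friendly storage, while querying is what needs the indexes. A toy receiver sketch (Graylog2 or rsyslog would replace this in practice; the port is a placeholder, since real syslog uses UDP 514 and needs root for it):

        import socketserver

        class SyslogHandler(socketserver.BaseRequestHandler):
            def handle(self):
                data, _sock = self.request        # UDP gives (bytes, socket)
                # Hot path: append sequentially; index asynchronously elsewhere.
                with open("collected.log", "ab") as f:
                    f.write(data.rstrip() + b"\n")

        socketserver.UDPServer(("0.0.0.0", 5140), SyslogHandler).serve_forever()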

    Read the article

  • Oracle Endeca Information Discovery 3.1 is Now Available

    - by p.anda
    Oracle Endeca Information Discovery (OEID) 3.1 is a major release that incorporates significant new self-service discovery capabilities for business users. These include agile data mashup, extended support for unstructured analytics, and even tighter integration with Oracle BI. This release is available for download from the Oracle Delivery Cloud and the Oracle Technology Network. Some of the what's-new highlights:
    - Self-service data mashup: enables access to a wider variety of personal and trusted enterprise data sources; blend multiple data sets in a single app.
    - Agile discovery dashboards: allow users to easily create, configure, and securely share discovery dashboards, with intelligent defaults, intuitive wizards and drag-and-drop configuration.
    - Deeper unstructured analysis: enables users to enrich text using term extraction and whitelist tagging while the data is live.
    - Enhanced integration with OBI: provides easier wizards for data selection and enables the OBI Server as a self-service data source.
    - Enterprise-class data discovery: offers faster performance, a trusted data connection library, improved auditing and increased data connectivity for Hadoop, web content and Oracle Data Integrator.
    Find out more: visit the OEID Overview page to download the What's New and related Data Sheet PDF documents. Have questions or want to share details about Oracle Endeca Information Discovery? The MOS OEID Community is a great first stop.

    Read the article

  • Cloud based backup solutions based on open standards?

    - by Rick
    I am looking for a solution to back up and consolidate important media from a couple of Windows laptops and a Mac laptop. I would like a solution based on open standards, so my data isn't trapped by proprietary formats and proprietary protocols, and so I have the ability to switch clients or change providers in the future. For example, something like Jungle Disk plus S3 sounds like a great option; however, I am having trouble confirming how, or whether, this can be set up while meeting these criteria. Are there any real or de-facto standards for treating S3 as a filesystem? If so, what Windows and Mac clients support those standards?

    Read the article

  • Is any EXIF data stored within 3rd party Camera apps on the iPhone?

    - by 3rdparty
    I'm confused as to whether any EXIF data is available when taking photos within 3rd-party camera apps on the iPhone. My understanding is that Apple is currently not allowing any apps to save EXIF data to photos, and that this is a limitation of saving to the camera roll on the phone. The last FAQ on this page indicates this, but appears to be out of date: http://www.codegoo.com/page/support I love some of the camera apps I've downloaded (Camera Genius, Best Camera, CameraBag) but don't want to continue using them if they aren't saving any/all EXIF data for the image. Is anyone aware of what the status of this 'limitation' is?
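
    An easy way to settle it for any given app is to pull one of its photos off the phone and inspect the file directly. A small sketch using the Pillow imaging library (a modern tool used purely as an illustration; the filename is a placeholder):

        from PIL import Image
        from PIL.ExifTags import TAGS

        exif = Image.open("IMG_0001.JPG").getexif()
        if not exif:
            print("No EXIF data survived this app's save path.")
        for tag_id, value in exif.items():
            # Map numeric EXIF tag IDs to readable names (Make, Model,
            # DateTime, ...) so missing fields are easy to spot.
            print(TAGS.get(tag_id, tag_id), ":", value)

    Comparing the output for a photo taken with the built-in Camera app against one from a 3rd-party app shows exactly which fields, if any, get dropped.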

    Read the article
