Search Results

Search found 17686 results on 708 pages for 'high level'.

Page 412/708

  • Compressing/compacting messages over websocket on Node.js

    - by icelava
    We have a websocket implementation (Node.js/Sock.js) that exchanges data as JSON strings. As our use cases grow, so has the size of the data transmitted across the wire. The websocket protocol does not natively offer any compression feature, so to reduce the size of our messages we'd have to do something about the serialisation ourselves. There appear to be a variety of LZW implementations in Javascript; because of my shaky grasp of low-level encodings, I'm unsure which of them are suitable only for in-browser use and which are suitable for transmission across the wire. More importantly, all of them seem to impose a noticeable performance drag when Javascript is the engine doing the compression/decompression work, which is not desirable for mobile devices. Looking instead at other forms of compact serialisation: MessagePack does not appear to have any active support in Javascript itself; BSON does not have a Javascript implementation; and an alternative BISON project that I tested does not deserialise everything back to its original values (large numbers), and it does not look like any further development will happen either. What are some other options others have explored for Node.js?
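
    One pragmatic option, given the constraints above (keep JSON, no extra wire format), is to deflate the serialised payload with Node's built-in zlib and ship it base64-encoded over the text-only SockJS channel. The sketch below is illustrative only, not the poster's code; the function names are assumptions, and the browser side still needs a matching inflate step, so the mobile CPU trade-off has to be measured rather than assumed away.

        import { deflateSync, inflateSync } from "zlib";

        // Serialise and deflate a message; base64 keeps it safe for text-only transports.
        function packMessage(message: unknown): string {
          const json = JSON.stringify(message);
          return deflateSync(Buffer.from(json, "utf8")).toString("base64");
        }

        // Reverse of packMessage: decode, inflate, parse.
        function unpackMessage(packed: string): unknown {
          const json = inflateSync(Buffer.from(packed, "base64")).toString("utf8");
          return JSON.parse(json);
        }

    Measuring the deflate/inflate time against the bandwidth actually saved, per typical message size, is the quickest way to decide whether this pays off for mobile clients.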

    Read the article

  • More than one way to skin an Audit

    - by BuckWoody
    I get asked quite a bit about auditing in SQL Server. By "audit", people mean everything from tracking logins to finding out exactly who ran a particular SELECT statement. In the really early versions of SQL Server, we didn't have a great story for very granular audits, so lots of workarounds were suggested. As time progressed, more and more audit capabilities were added to the product, and in typical database platform fashion, as we added a feature we didn't often take the others away. So now, instead of not having an option to audit actions by users, you might face the opposite problem - too many ways to audit! You can read more about the options you have for tracking users here: http://msdn.microsoft.com/en-us/library/cc280526(v=SQL.100).aspx  In SQL Server 2008, we introduced SQL Server Audit, which uses Extended Events to really get a simple way to implement high-level or granular auditing. You can read more about that here: http://msdn.microsoft.com/en-us/library/dd392015.aspx  As with any feature, you should understand what your needs are first. Auditing isn't "free" in the performance sense, so you need to make sure you're only auditing what you need to.

    Read the article

  • Worthless Anti-Spam (What can we learn)

    - by smehaffie
    I recently came across a site that had an "anti-spam" field at the bottom of the entry form. The first issue I had with it was that at 1280x800 you could not read the value you were supposed to enter (see below). You tell me: should you enter div, dlv, piv, or plv? But even worse than being unreadable at high resolutions is the fact that the programmer who coded it really did not understand what this is used for. An anti-spam (aka captcha) entry field should not be readable by looking at the HTML DOM (so entry of the value cannot be scripted). In this case the value is simply a disabled text input field holding the value you need to type. A spammer would only need to find the text input named "spam2", read its value, and they could flood the site with spam.

        <td>
          <label>
            <input name="spam1" type="text" class="small" id="spam1" size="6" maxlength="3" />
            <input name="spam2" type="text" class="small" id="spam2" value="plv"
                   disabled="disabled" size="6" maxlength="3" />
            * <span class="small">- Anti-SPAM key - please enter matching value</span>
          </label>
        </td>

    There are some things to learn from this example:
    1) Always make sure you understand why you are coding a feature/function for any program you write. Just following the requirements without understanding the "why" will sooner or later come back to bite you; the above appears to be an example of exactly that.
    2) Always check how the screen appears at different resolutions. In this case it was pretty much unreadable at 1280x800, but you could read it at 800x600 (though most people I know do not have their resolution set that low). Lucky for me I could "View Source" and get the value I needed to enter.
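
    The underlying fix is to keep the expected answer on the server and give the client only an opaque token plus a rendering (ideally an image) of the challenge. Here is a minimal sketch of that idea; the in-memory store, names, and token format are assumptions for illustration, not the site's actual code.

        import { randomBytes } from "crypto";

        // Expected answers live server-side, keyed by an opaque token;
        // the answer itself never appears in the page markup.
        const pendingChallenges = new Map<string, string>();

        function createChallenge(): { token: string; answer: string } {
          const answer = randomBytes(2).toString("hex");   // e.g. "3fa1"; render it as an image, not text
          const token = randomBytes(16).toString("hex");   // opaque handle embedded in the form
          pendingChallenges.set(token, answer);
          return { token, answer };
        }

        function verifyChallenge(token: string, userInput: string): boolean {
          const expected = pendingChallenges.get(token);
          pendingChallenges.delete(token);                 // single use
          return expected !== undefined && expected === userInput.trim();
        }

    In a real deployment the answer would be distorted into an image (or an off-the-shelf captcha service used) and the store given a TTL, but the key point stands: nothing a bot can scrape from the DOM should reveal the expected value.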

    Read the article

  • Digital HD Transition

    - by Bill Evjen
    The HD Experience:
    - Roughly 53% of the viewing public has HD-capable devices in their home
    - 24% think they are watching HD while they have no subscription to any HD content
    Today's HD Considerations - choices abound:
    - format resolution – 720p, 1080i/p
    - frame rates
    - compression and wrapping
    - audio compression and delivery
    - metadata packaging, delivery, and usage
    - content delivery protocols
    Metadata is going to be a part of the overall experience, with emerging technologies:
    - Super Hi-Vision (SHV, UHDTV 4320p), 3D
    - HEVC/H.265, WebM/VP8
    - HDBaseT, P2PTV
    - Dolby Pulse/HE-AAC
    Industry standardization:
    - Metadata registration, packaging, and delivery standards
    Improved picture and sound quality is a logical next step, but we also need to think about the end-to-end viewing experience, including:
    - 3D video and audio content
    - Mixed-mode viewing to bring interactive and immersive experiences
    - Content transportability both on-to and off-of the aircraft
    High Definition Standardization:
    - Analog switch-off around the world
    - DTV transition completed: 17 countries
    - DTV transition in progress: 45 countries
    - The EU has mandated the end of 2012 as the final date for analog switch-off
    - D-Cinema was standardized by SMPTE in 2006
    - Airlines are installing HD displays today; passengers are bringing their own devices now
    - HD TVs on airlines are getting bigger and bigger – bigger than SD was – now up to 23"
    - Gray scale data input for color – 6 to 8 bit
    - Contrast – 400 to 700
    - Backlit – LED
    - Encryption – can it be the same for HD?
    - PPV in the cabin?

    Read the article

  • Challenging Job after Graduate Studies

    - by sriram
    I worked at an M.N.C. developing web applications in Java/J2EE-related technologies (including JSF, Struts, Hibernate, etc.), then quit my job to pursue graduate studies in the U.S.A., so I am now a student in the middle of my graduate program. I have had enough of developing mere CRUD applications in J2EE; now I want to work on something exciting. The problem is I can't say exactly what, but I can give examples: say, developing new JDK libraries, or writing a kernel for some O.S. So I have five questions:
    1) Is it true that people in R&D often use C++ because of its higher performance? If so, should I consider switching my platform to C/C++?
    2) I have one year until I graduate; how should I use that time to prepare for job interviews for such positions (e.g., reading books on algorithms)?
    3) How do I find out about these jobs, and how do I apply for them?
    4) Is now the right time for me to be thinking about jobs?
    5) Am I overambitious, given that I am not at an Ivy League school, just a normal one? (My GPA is unfortunately not very high.)

    Read the article

  • Usual Suspects: Typical 3rd Party Entities in E-Commerce [closed]

    - by zharvey
    I am doing some requirements/analysis for a web app that I'd like to build (Ruby/Java developer here). This web app would have a store front and shopping cart and would need to be totally compliant with all e-com best practices. It's amazing how much non-technical info comes up when you search for phrases like "how does e-commerce work", but very little comes up in the way of technical details. As such, I'm having extreme frustration finding answers to what I consider pretty straight-forward questions. I came here because I believe this question is not off-topic; if it is, please leave a comment as to why this question does not belong here and I will happily remove it myself (upvotes if your comment can point me to the correct place for this question!). So then:
    - What 3rd parties will I need to work with to have a modern, web-compliant e-com site? So far I can account for a payment gateway provider like Authorize.net and an SSL certificate provider like Trustwave. Any others?
    - What other standards besides PCI compliance will I be held to (besides governing laws, of course!)?
    - Vulnerability scans: PCI compliance requires quarterly scans. If I'm a "Level 4" (low-volume) merchant, does that still apply to me? Regardless, my backend architecture is quite large, with web servers, app servers, a database, message brokers and more. Does each of these servers need to be scanned? If not, which servers do need these quarterly scans?
    I usually hate to ask micro-questions inside of one large one, but these are so closely related I just felt that asking them all separately would be spamming the site with too many petty questions. Thanks in advance!

    Read the article

  • ORM and component-based architecture

    - by EagleBeek
    I have joined an ongoing project, where the team calls their architecture "component-based". The lowest level is one big database. The data access (via ORM) and business layers are combined into various components, which are separated according to business logic. E.g., there's a component for handling bank accounts, one for generating invoices, etc. The higher levels of service contracts and presentation are irrelevant for the question, so I'll omit them here. From my point of view the separation of the data access layer into various components seems counterproductive, because it denies us the relational mapping capabilities of the ORM. E.g., when I want to query all invoices for one customer I have to identify the customer with the "customers" component and then make another call to the "invoices" component to get the invoices for this customer. My impression is that it would be better to leave the data access in one component and separate it from business logic, which may well be cut into various components. Does anybody have some advice? Have I overlooked something?
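
    To make the cost concrete, here is a small sketch (hypothetical interfaces, not the project's code) of what the per-component split forces: the relational lookup the ORM could express in one query becomes two calls stitched together in application code.

        interface Customer { id: number; name: string; }
        interface Invoice { id: number; customerId: number; }

        // Hypothetical component facades, one per business area, as described above.
        interface CustomerComponent { findByName(name: string): Promise<Customer>; }
        interface InvoiceComponent { findByCustomerId(customerId: number): Promise<Invoice[]>; }

        // With data access split per component, the customer/invoice "join" happens here,
        // in application code, as two round trips instead of one mapped query.
        async function invoicesForCustomer(
          customers: CustomerComponent,
          invoices: InvoiceComponent,
          customerName: string
        ): Promise<Invoice[]> {
          const customer = await customers.findByName(customerName);
          return invoices.findByCustomerId(customer.id);
        }

    With a single data-access layer exposing the mapped associations, the same lookup would be one query along the customer-to-invoices relation, which is exactly the capability the current split gives up.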

    Read the article

  • Internet slow on one router only [the problem only in Ubuntu] [on hold]

    - by mrSuperEvening
    The internet works perfectly on every other router, but browsing sucks at home (slow browsing and slow loading times). I changed the DNS servers to 8.8.0.0, which still doesn't help. And funnily, download speed is extremely high on this network (torrents, for example), but using browsers and loading websites is extremely slow (only on this network). Do I need to change something in the router settings, or what can I try? By the way, I use a wired connection to the router. EDIT: There are no problems when using Windows. EDIT: ifconfig:

        eth0      Link encap:Ethernet  HWaddr f2:4d:a0:c0:3f:4c
                  inet addr:192.168.11.8  Bcast:192.168.11.255  Mask:255.255.255.0
                  inet6 addr: fe80::f24d:a2ff:fec6:3f4c/64 Scope:Link
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  RX packets:206798 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:219570 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:76680734 (76.6 MB)  TX bytes:21738160 (21.7 MB)

        lo        Link encap:Local Loopback
                  inet addr:127.0.0.1  Mask:255.0.0.0
                  inet6 addr: ::1/128 Scope:Host
                  UP LOOPBACK RUNNING  MTU:65536  Metric:1
                  RX packets:160 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:160 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:0
                  RX bytes:11094 (11.0 KB)  TX bytes:11094 (11.0 KB)

        ping -c 2 4.2.2.2
        PING 4.2.2.2 (4.2.2.2) 56(84) bytes of data.
        --- 4.2.2.2 ping statistics ---
        2 packets transmitted, 0 received, 100% packet loss, time 1007ms

        ping -c 2 google.com
        PING google.com (213.159.32.147) 56(84) bytes of data.
        64 bytes from lan-213-159-32-147.kns.skynet.lv (213.159.32.147): icmp_seq=1 ttl=61 time=0.936 ms
        64 bytes from lan-213-159-32-147.kns.skynet.lv (213.159.32.147): icmp_seq=2 ttl=61 time=0.937 ms
        --- google.com ping statistics ---
        2 packets transmitted, 2 received, 0% packet loss, time 1001ms
        rtt min/avg/max/mdev = 0.936/0.936/0.937/0.030 ms

    Read the article

  • How do i approach this collision model?

    - by PeeS
    This is the game level prototype I have already implemented. It has a few objects per room, which finally lets me add some collision detection/response code. VIDEO As you can probably see, every object inside has its own AABB; even the room itself has an AABB, so the player is effectively 'inside the room's AABB'. My player will be exactly inside the room, so he has to collide correctly with those AABBs: when he hits any of the objects inside, he should get a proper collision response from their AABBs. Now I would like to hear from you what kind of collision approach I should choose here. How do I approach this kind of thing:
    - AABB-to-AABB collision detection, and when it is positive, go AABB-to-triangle to find the proper plane normal and calculate the response?
    - AABB-to-AABB, and when positive, do an AABB-to-AABB side check to find the proper plane normal and calculate the response?
    - Anything else?
    How do you do this? Many thanks.
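
    For the boxes-only route, the usual building blocks are an axis-wise overlap test plus a "least penetration" resolution that pushes the player out along the axis with the smallest overlap, which also yields the contact normal. A rough sketch with assumed types, not anyone's engine code:

        interface AABB { min: [number, number, number]; max: [number, number, number]; }

        // Separating-axis test for axis-aligned boxes: disjoint on any axis means no hit.
        function intersects(a: AABB, b: AABB): boolean {
          for (let i = 0; i < 3; i++) {
            if (a.max[i] < b.min[i] || a.min[i] > b.max[i]) return false;
          }
          return true;
        }

        // Smallest push-out vector: resolve along the axis of least overlap.
        function resolve(player: AABB, obstacle: AABB): [number, number, number] {
          const push: [number, number, number] = [0, 0, 0];
          let bestAxis = 0;
          let bestDepth = Infinity;
          for (let i = 0; i < 3; i++) {
            const depth = Math.min(player.max[i] - obstacle.min[i], obstacle.max[i] - player.min[i]);
            if (depth < bestDepth) { bestDepth = depth; bestAxis = i; }
          }
          const sign = (player.min[bestAxis] + player.max[bestAxis]) <
                       (obstacle.min[bestAxis] + obstacle.max[bestAxis]) ? -1 : 1;
          push[bestAxis] = sign * bestDepth;
          return push;
        }

    The contact normal is then the unit vector along bestAxis with the sign of the push, which is what the response step needs; the AABB-versus-triangle route only becomes necessary if the room geometry stops being axis-aligned.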

    Read the article

  • LIfebook lh531 inbuilt card reader not working

    - by chandrasekar
    The inbuilt card reader (SD/PRO/SDHC) is not working. When I insert a memory card the indicator comes on for a split second and nothing happens. When I run lspci it gives the output pasted below. I use Ubuntu 11.10. Please help.

        pro-hq@prohq-LIFEBOOK-LH531:~$ lspci
        00:00.0 Host bridge: Intel Corporation 2nd Generation Core Processor Family DRAM Controller (rev 09)
        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
        00:16.0 Communication controller: Intel Corporation 6 Series/C200 Series Chipset Family MEI Controller #1 (rev 04)
        00:1a.0 USB Controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #2 (rev 05)
        00:1b.0 Audio device: Intel Corporation 6 Series/C200 Series Chipset Family High Definition Audio Controller (rev 05)
        00:1c.0 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 1 (rev b5)
        00:1c.2 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 3 (rev b5)
        00:1d.0 USB Controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #1 (rev 05)
        00:1f.0 ISA bridge: Intel Corporation HM65 Express Chipset Family LPC Controller (rev 05)
        00:1f.2 SATA controller: Intel Corporation 6 Series/C200 Series Chipset Family 6 port SATA AHCI Controller (rev 05)
        00:1f.3 SMBus: Intel Corporation 6 Series/C200 Series Chipset Family SMBus Controller (rev 05)
        01:00.0 Network controller: Intel Corporation Centrino Advanced-N 6205 (rev 34)
        02:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168B PCI Express Gigabit Ethernet controller (rev 06)
        pro-hq@prohq-LIFEBOOK-LH531:~$

    Read the article

  • Implementation Specialist OPN Exam for OBI Suite 11g is Now LIVE

    - by Mike.Hallett(at)Oracle-BI&EPM
    The OPN specialisation exam for implementation consultants of OBI Suite 11g is now live and ready for all partners. You can now update your specialisation certification to the latest product version, 11g, for OBI: until recently, the accreditation had examined skills for OBI 10g. For more details see Oracle Business Intelligence Foundation Suite 11g Essentials (1Z1-591), where you can apply for this Oracle Implementation Specialist credential. This exam is primarily intended for consultants who are skilled in implementing solutions based on Oracle Business Intelligence Foundation Suite 11g. The certification covers skills such as installing OBIEE, building the BI Server metadata repository, building BI dashboards, constructing ad hoc queries, defining security settings, and configuring and managing cache files. The exam targets the intermediate-level implementation team member; up-to-date training and field experience are recommended.
    Also note the new OPN on-line Sales & Pre-sales Assessment Tests are available:
    - Oracle Business Intelligence Foundation Suite 11g Sales Specialist
    - Oracle Business Intelligence Foundation Suite 11g PreSales Specialist
    - Oracle Business Intelligence Foundation Suite 11g Support Specialist
    FREE Certification Testing at OpenWorld: Are you attending OPN Exchange @ OpenWorld? Then join us at the OPN Specialist Test Fest, October 1st - 4th 2012, Marriott Marquis Hotel. Pre-register here now!
    For More Information: OPN Certified Specialist exams | OPN Certified Specialist FAQ | Enablement 2.0 Get Specialized!

    Read the article

  • How can I disable recent documents in Unity?

    - by detly
    How do I disable the tracking and display of recently opened files (and whatever else is remembered) in a default installation of Ubuntu 11.10? (Note that this is not a duplicate of How can I keep recent files from appearing in Unity?, since that question and its answers are concerned with temporary and specific filtering. I want to disable it completely for a single user account.) Okay, to deflect the inevitable and expand on my motivation... While trawling the usual forums and Google results for a solution, it (unsurprisingly) seems that the near-universal use cases for this request are either browsing porn or Warhammer research. And the obvious solution to this is to create another user account to contain all evidence. However, this is not why I'm asking, and I don't say that to get all high and mighty about it, it's because this answer won't help. (Even though I really don't have any interest in Warhammer, and I have no idea how that paint pot and brush ended up in my drawer, no that's not glue on my thumb, etc.) My actual use case is that I use my personal laptop for presentations in different circles of my life. I have a user account set up with all the settings I like for presentations (shortcuts, small launcher, default associations, etc). But I don't want an accidental keystroke (or the find dialog) to display other recent presentations I've given, or the files I used in composing the presentation, or whatever. I also don't want to have to recreate this profile for every single presentation I might give. I just want a nice little isolated, memoryless, clean corner of my notebook for public display.

    Read the article

  • How to determine the amount to spend per phrase on Adwords research?

    - by Anonymous -
    My company would like to start a PPC advertising campaign. While I understand the concept and how to set everything up from a technical point of view, this is something I've never done before. Logically, we'd like to test a wide range of keywords that we think would lead to conversions, which we've put together through brainstorming and with some help from Google's External Keyword Tool. A sub-question while I remember: am I correct in thinking that, in Google's keyword tool, keywords we expect to perform well that have low competition yet high monthly searches are good picks, since fewer advertisers means our bid per click will be lower? Is there a common benchmark or process for doing a round of keyword tests? Should we wait for 100 clicks on each keyword, see which ones have led to the most sales (or rather, sales that are sustainable given that keyword's cost per click), then drop the ones which aren't converting and move that budget onto the converting keywords? We realistically have a few hundred keywords/phrases we would like to test, but spending $100 per keyword/phrase is going to work out as quite an expensive test. It would be nice to be able to spend $5-10 per phrase, but I don't think the sample size would be great enough to determine anything usefully reliable. Another approach might be to set up all the keywords, and those that bring the most sales within x hours/days would be the ones we use. What is the common procedure with things like this? I know there are a plethora of companies that specialize in exactly this, but this is something we anticipate doing a lot in the future, so it would make sense to do it in house if at all possible.

    Read the article

  • How to determine if a 3D voxel-based room is sealed, efficiently

    - by NigelMan1010
    I've been having trouble efficiently determining whether large rooms are sealed in a voxel-based 3D world. I'm at the point where I have tried my hardest to solve the problem without asking for help, but not so hard that I'm giving up, so I'm asking for help. To clarify, "sealed" means there are no holes in the room. There are oxygen sealers, which check whether the room is sealed and seal it depending on the oxygen input level. Right now, this is how I'm doing it:
    Starting at the block above the sealer tile (the vent is on the sealer's top face), recursively loop through all 6 adjacent directions:
    - If the adjacent tile is a full, non-vacuum tile, continue through the loop.
    - If the adjacent tile is not full, or is a vacuum tile, check its adjacent blocks recursively.
    - Each time a tile is checked, decrement a counter.
    - If the counter hits zero and the last block is adjacent to a vacuum tile, return that the area is unsealed.
    - If the counter hits zero and the last block is not a vacuum tile, or the recursive loop ends (no vacuum tiles left) before the counter reaches zero, the area is sealed.
    If the area is not sealed, run the loop again with some changes:
    - Check adjacent blocks for a "breathable air" tile instead of a vacuum tile.
    - Instead of using a decrementing counter, continue until no adjacent "breathable air" tiles are found.
    - Once the loop is finished, set each checked block to a vacuum tile.
    Here's the code I'm using: http://pastebin.com/NimyKncC
    The problem: I'm running this check every 3 seconds, and sometimes a sealer has to loop through hundreds of blocks. In a large world with many oxygen sealers, these multiple recursive loops every few seconds can be very hard on the CPU. I was wondering if anyone with more experience with optimization can give me a hand, or at least point me in the right direction. Thanks a bunch.
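
    For reference, here is the general shape of the same search written iteratively (an explicit stack instead of recursion) with an upper bound on how many tiles it will visit. The names and the isSolid callback are assumptions for illustration, not the pastebin code; an iterative fill avoids deep call stacks, and caching the result per region until a block in that region actually changes (instead of re-scanning every 3 seconds) is usually where the big CPU win comes from.

        type Vec3 = [number, number, number];

        const NEIGHBOURS: Vec3[] = [[1, 0, 0], [-1, 0, 0], [0, 1, 0], [0, -1, 0], [0, 0, 1], [0, 0, -1]];

        // Iterative flood fill from the vent tile: the region counts as "sealed" if the fill
        // stops (every frontier tile is solid) before exceeding the maxVolume tile budget.
        function isSealed(
          start: Vec3,
          isSolid: (x: number, y: number, z: number) => boolean,
          maxVolume: number
        ): boolean {
          const stack: Vec3[] = [start];
          const seen = new Set<string>([start.join(",")]);
          while (stack.length > 0) {
            if (seen.size > maxVolume) return false;        // leaked past the volume budget
            const [x, y, z] = stack.pop()!;
            for (const [dx, dy, dz] of NEIGHBOURS) {
              const nx = x + dx, ny = y + dy, nz = z + dz;
              if (isSolid(nx, ny, nz)) continue;            // walls stop the fill
              const key = `${nx},${ny},${nz}`;
              if (!seen.has(key)) { seen.add(key); stack.push([nx, ny, nz]); }
            }
          }
          return true;                                      // fill exhausted inside the budget
        }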

    Read the article

  • Launcher icons are invisible after upgrade from 11.10 to 12.04

    - by Clo Knibbe
    I am re-purposing an old laptop. I installed 11.10 on it and then immediately upgraded to 12.04. (I could not directly install 12.04, as my system does not support PAE.) When my system was (briefly) on 11.10, the desktop appeared as expected. However, after the upgrade to 12.04, the icons in the launcher area are invisible. If I hover over the spot where an icon should be, the little popup window showing the tool's name appears, and I can click to invoke the tool. I just cannot see the icons. The icons do appear as expected in other contexts, for example in the Home folder and in Dash Home. My theme is "Ambiance (default)". I do not have a ~/.icons folder. This is the top-level contents of /usr/share/icons: default DMZ-Black DMZ-White gnome handhelds hicolor HighContrast HighContrastInverse Humanity Humanity-Dark locolor LoginIcons LowContrast redglass ubuntu-mono-dark ubuntu-mono-light unity-icon-theme whiteglass. I suspect that the launcher isn't looking for the icons in the right place, but I don't know how to confirm that, or how to correct it. This is my first foray into Linux, although I used to use Unix a few decades ago. (This doesn't look much like my old Sun workstation, though!) Does anyone have any suggestions or insights for me? Thanks.

    Read the article

  • Is there a cheaper non-express non-student, non-msdn version of Visual Studio 2010 that supports plugins in the US than the $710 Professional Edition?

    - by Justin Dearing
    I've never actually purchased a copy of Visual Studio myself. SharpDevelop and the Express editions have always been good enough for my personal use, and my employers have always furnished me with the IDEs I needed to serve them. However, I'm thinking of actually paying for a copy for my personal laptop. I need it mainly so I can open solutions that contain web projects. So my question is: is there an edition cheaper than the $710 Pro edition on Amazon that will do what I need? http://www.amazon.com/Microsoft-C5E-00521-Visual-Studio-Professional/dp/B0038KTO8S/ref=sr_1_2?ie=UTF8&qid=1287456230&sr=8-2
    What I need is defined as:
    - Open a solution containing C#, web app, VB.NET, and web projects.
    - Install add-ins like ReSharper, TestDriven.NET, SCM plugins, etc.
    - Some level of DB project support - at least being able to open a .dbproj. I only need that for SCM hooks; SSMS and SQLCMD are good enough for actually editing databases.
    - The ability to install F#, IronPython, IronRuby, etc.
    Now, naturally, I'm a fairly intelligent, resourceful person, so I realize I can get Visual Studio in a questionable manner. That's not what I'm looking to do. I want a legal copy; I don't want a student copy or an MSDN copy. I want a real copy, and I just want to make sure I get the cheapest edition that serves my needs.

    Read the article

  • CodePlex Daily Summary for Sunday, October 28, 2012

    CodePlex Daily Summary for Sunday, October 28, 2012
    Popular Releases:
    - Player Framework by Microsoft: Player Framework for Windows 8 (Preview 7): This release is compatible with the version of the Smooth Streaming SDK released today (10/26). Release 1 of the player framework is expected to be available next week. IMPROVEMENTS & FIXES - IMPORTANT: List of breaking changes from preview 6. Support for the latest smooth streaming SDK. Xaml only: Support for moving any of the UI elements outside the MediaPlayer (e.g. into the appbar). Note: Equivalent changes to the JS version due in coming week. Support for localizing all text used in t...
    - Send multiple SMS via Way2SMS C#: SMS 1.1: Added support for 160by2
    - Quick Launch: Quick Launch 1.0: A Lightweight and Fast Way to Manage and Launch Thousands of Tools and Applications. Press Win+Q and start to search and run. http://www.codeplex.com/Download?ProjectName=quicklaunch&DownloadId=523536
    - Orchard Project: Orchard 1.6: Please read our release notes for Orchard 1.6: http://docs.orchardproject.net/Documentation/Orchard-1-6-Release-Notes Please do not post questions as reviews. Questions should be posted in the Discussions tab, where they will usually get promptly responded to. If you post a question as a review, you will pollute the rating, and you won't get an answer.
    - ZXMAK2: Version 2.6.7.0: small performance improvements; fix & improvements for Direct3D renderer (thanks to zebest for testing)
    - Media Companion: Media Companion 3.507b: Once again, it has been some time since our release, and there have been a number of changes since then. It is hoped that these changes will address some of the issues users have been experiencing, and of course, work continues! New Features: Added support for adding Home Movies. Option to sort Movies by votes. Added 'selectedBrowser' preference used when opening links in an external browser. Added option to fall back to getting runtime from the movie file if not available on IMDB. Added new Big...
    - MSBuild Extension Pack: October 2012: Release Blog Post. The MSBuild Extension Pack October 2012 release provides a collection of over 475 MSBuild tasks. A high level summary of what the tasks currently cover includes the following: System Items: Active Directory, Certificates, COM+, Console, Date and Time, Drives, Environment Variables, Event Logs, Files and Folders, FTP, GAC, Network, Performance Counters, Registry, Services, Sound. Code: Assemblies, AsyncExec, CAB Files, Code Signing, DynamicExecute, File Detokenisation, GUI...
    - NAudio: NAudio 1.6: Release notes at http://mark-dot-net.blogspot.co.uk/2012/10/naudio-16-release-notes-10th.html
    - PowerShell Community Extensions: 2.1 Production: PowerShell Community Extensions 2.1 Release Notes, Oct 25, 2012. This version of PSCX supports both Windows PowerShell 2.0 and 3.0. See the ReleaseNotes.txt download above for more information.
    - Umbraco CMS: Umbraco 4.9.1: Umbraco 4.9.1 is a bugfix release to fix major issues in 4.9.0. Bugfixes: The full list of fixes can be found in the issue tracker's filtered results. A summary: Split buttons work again, you can now also scroll easier when the list is too long for the screen. Media and Content pickers have information of the full path of the picked item. Fixed: Publish status may not be accurate on nodes with large doctypes. Fixed: 2 media folders and recycle bins after upgrade to 4.9. The template/code ...
    - AcDown????? - AcDown Downloader Framework: AcDown????? v4.2.2: the original description is mostly unrecoverable due to encoding damage; it mentions support for Acfun, Bilibili, YouTube and other sites, the AcPlay player plugin, implementation in C# on .NET Framework 2.0, and 32/64-bit Windows XP/Vista/7/8 or Linux with Mono.
    - Rawr: Rawr 5.0.2: This is the Downloadable WPF version of Rawr! For the web-based version see http://elitistjerks.com/rawr.php You can find the version notes at: http://rawr.codeplex.com/wikipage?title=VersionNotes Rawr Addon (NOT UPDATED YET FOR MOP): We now have a Rawr Official Addon for in-game exporting and importing of character data hosted on Curse. The Addon does not perform calculations like Rawr, it simply shows your exported Rawr data in wow tooltips and lets you export your character to Rawr (including ba...
    - MCEBuddy 2.x: MCEBuddy 2.3.5: Changelog for 2.3.5 (32bit and 64bit): 1. Fixed a bug causing MCEBuddy to crash during or after installation on Windows XP. 2. Bugfix for resource leak with UPnP which would lead to a failure after many days. 3. Increased the UPnP discovery re-scan interval from 10 minutes to 30 minutes. 4. Added support for specifying TVDB and IMDB id's in the conversion task page (forcing the internet lookup for metadata).
    - CRM 2011 Visual Ribbon Editor: Visual Ribbon Editor (1.3.1025.5): [NEW] Support for connecting to CRM Online via Office 365 (OSDP). [NEW] Current connection information and loaded ribbon name are displayed in the status bar. [IMPROVED] Connect dialog minor improvements and error message descriptions. [IMPROVED] Connecting to a CRM server will close currently loaded ribbon upon confirmation (if another ribbon was loaded previously). [FIX] Fixed bug in Open Ribbon dialog which would not allow to refresh entity list more than once.
    - Readable Passphrase Generator: KeePass Plugin 0.8.0: Changes: Interrogative phrases (questions) like "why did the statesman burgle amidst lucid sunlamps". Support transitive / intransitive verbs (whether a verb needs a subject or not). Change adverbs to be either before or after the verb, at random. Add an "equal" version of each strength, where each possibility is equally likely (for password purists). 3401 words in the default dictionary (~400 more than previous release). Fixed bugs when choosing verb tenses.
    - eWay payment gateway provider for NB_Store: NB_Store_Gateway_eWay: Install package for eWAY gateway for NB_Store.
    - fastJSON: v2.0.9: added support for root level DataSet and DataTable deserialize (you have to do ToObject<DataSet>(...) ); added dataset tests
    - Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.72: Fix for Issue #18819 - bad optimization of return/assign operator.
    - WPF Application Framework (WAF): WPF Application Framework (WAF) 2.5.0.390: Version 2.5.0.390 (Release Candidate): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements: .NET Framework 4.0 (the package contains a solution file for Visual Studio 2010); the unit test projects require Visual Studio 2010 Professional. Changelog Legend: [B] Breaking change; [O] Marked member as obsolete. WAF: Fix recent file list remove issue. WAF: Minor code improvements. BookLibrary: Fix Blend design time support o...
    - Fiskalizacija za developere: FiskalizacijaDev 1.1: This is the first update of the project since its initial release - we have added several features, some because we noticed they would be good to add and some based on your suggestions - thanks to everyone who got involved :) These are the items addressed in v1.1: 1. It would be good if the XML document sent to CIS could be saved to a file (http://fiskalizacija.codeplex.com/workitem/612) 2. Support for COM DLL (VB6) (http://fiskalizacija.codeplex.com/workitem/613) 3. Support for DOS (unu...
    New Projects:
    - Analytics for Windows 8: Silverlight / WPF user control integrated in WPF / SILVERLIGHT / WINDOWS PHONE apps in order to collect user data (Action, Timespan etc.).
    - bGui - User Interfaces in Batch: A small project for allowing basic user interface creation in batch. Still under strict development, see documentation for current functionality.
    - Blog Projects: Projects which I use in my blog posts
    - Cloud Sync Service: This Windows Service lets you sync your files across other machines by using Cloud File Storage as gateway. Amazon S3 and Windows Azure supported.
    - CocoaVP8: CocoaVP8 is a dual-licensed commercial/GPL native library which allows Mac and iOS applications to easily use the VP8 video codec.
    - DarkSky Orchard Microdata: Microdata is an Orchard module that exposes a MicrodataPart that will render microdata for your content items.
    - Gerenciador Paroquial: A set of tools developed to help administer episcopal church communities.
    - Quick Launch: Quick Launch - a New Lightweight and Fast Way to Manage and Launch Thousands of Tools and Applications
    - RDI Alef: RDI ALEF (ALef Entities Framework) is a library developed in C# for persisting objects to a database. It encapsulates a SQL handler and ...
    - Send multiple SMS via Way2SMS C#: Way2SMS, 160BY2, SMS, INDIA
    - Trending: Trending program; calculates the highest and lowest changes in frequency of specific words in two files.

    Read the article

  • Service layer coupling

    - by Justin
    I am working on writing a service layer for an order system in PHP. It's the typical scenario: you have an Order that can have multiple Line Items. So let's say a request is received to store a line item with pictures and comments. I might receive a JSON request such as:

        {
          "type": "Bike",
          "color": "Red",
          "commentIds": [3193, 3194],
          "attachmentIds": [123, 413]
        }

    My idea was to have a Service_LineItem_Bike class that knows how to take the JSON data and store an entity for a bike. My question is, the Service_LineItem class now needs to fetch comments and file attachments and store the relationships. Service_LineItem seems like it should interact with a Service_Comment and a Service_FileUpload. Should instances of these two other services be instantiated and passed to the Service_LineItem constructor, or set by getters and setters? Dependency injection seems like the right solution; giving a service access to a 'service fetching helper' seems wrong, and that should stay at the application level. I am using Doctrine 2 as an ORM, and I can technically write a DQL query inside Service_LineItem to fetch the comments and file uploads necessary for the association, but this seems like it would create tighter coupling than leaving it up to the right service object.
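
    Constructor injection is the usual answer here: the line-item service declares the collaborators it needs, and the application's composition root (or DI container) wires them up. A language-agnostic sketch of the shape, shown in TypeScript for brevity; the interfaces and names are hypothetical, not the actual Service_* classes.

        // Hypothetical collaborator interfaces -- the pattern, not the project's actual classes.
        interface ItemComment { id: number; }
        interface Attachment { id: number; }

        interface CommentService {
          findByIds(ids: number[]): Promise<ItemComment[]>;
        }
        interface AttachmentService {
          findByIds(ids: number[]): Promise<Attachment[]>;
        }

        interface BikeLineItemRequest {
          type: string;
          color: string;
          commentIds: number[];
          attachmentIds: number[];
        }

        class LineItemBikeService {
          // Collaborators arrive through the constructor, so the service never reaches
          // for a global locator and is trivial to test with fakes.
          constructor(
            private readonly comments: CommentService,
            private readonly attachments: AttachmentService
          ) {}

          async store(request: BikeLineItemRequest) {
            const comments = await this.comments.findByIds(request.commentIds);
            const attachments = await this.attachments.findByIds(request.attachmentIds);
            // ...map to an entity and persist it along with its associations...
            return { type: request.type, color: request.color, comments, attachments };
          }
        }

    Setter injection works too, but constructor injection makes the required collaborators explicit and prevents a half-configured service from ever existing.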

    Read the article

  • Setting up a shared media drive

    - by Sam Brightman
    I want a shared media drive to be transparently usable by all users, whilst also sticking to FHS and Ubuntu standards. The former takes priority if necessary. I currently mount it at /media/Stuff, but /media is supposed to be for external media, I believe. The main issue is setting permissions so that read and write access to the drive can be granted to multiple users working within the same directories. The InstallingANewHardDrive wiki page seems both slightly confused and not what I want. It claims that this sets ownership for the top-level directory (despite the recursion flag):

        sudo chown -R USERNAME:USERNAME /media/mynewdrive

    and that this will let multiple users create files and sub-directories but only delete their own:

        sudo chgrp plugdev /media/mynewdrive
        sudo chmod g+w /media/mynewdrive
        sudo chmod +t /media/mynewdrive

    However, the group-writeable bit does not seem to get inherited, which is troublesome for keeping things organised (it prevents creation inside sub-folders originally made by another user). The sticky bit is probably also unwanted for the same reason, although currently it seems that userA (perhaps the owner of the mount point?) can delete userB's files, but not vice versa. This is fine, as long as userB can create files inside userA's directories. So: What is the correct mount point? Is plugdev the correct group? Most importantly, how do I set up permissions to maintain an organised media drive? I do not want to be running cron jobs to set permissions regularly!

    Read the article

  • Extract all related class type aliasing and enum into one file or not

    - by Chen OT
    I have many models in my project, and some other classes only need a model's class declaration and pointer type aliases; they do not need the class definition, so I don't want them to include the model header files. I extracted all the model declarations into one file so that every class can reference a single header.

    model_forward.h:

        class Cat;
        typedef std::shared_ptr<Cat> CatPointerType;
        typedef std::shared_ptr<const Cat> CatConstPointerType;

        class Dog;
        typedef std::shared_ptr<Dog> DogPointerType;
        typedef std::shared_ptr<const Dog> DogConstPointerType;

        class Fish;
        typedef std::shared_ptr<Fish> FishPointerType;
        typedef std::shared_ptr<const Fish> FishConstPointerType;

        enum CatType { RED_CAT, YELLOW_CAT, GREEN_CAT, PURPLE_CAT };
        enum DogType { HATE_CAT_DOG, HUSKY, GOLDEN_RETRIEVER };
        enum FishType { SHARK, OCTOPUS, SALMON };

    Is this acceptable practice? Should I make every unit that needs a class declaration depend on this one file? Does it cause high coupling? Or should I put the pointer type aliases and enum definitions back inside the classes?

    cat.h:

        class Cat
        {
            typedef std::shared_ptr<Cat> PointerType;
            typedef std::shared_ptr<const Cat> ConstPointerType;
            enum Type { RED_CAT, YELLOW_CAT, GREEN_CAT, PURPLE_CAT };
            ...
        };

    dog.h:

        class Dog
        {
            typedef std::shared_ptr<Dog> PointerType;
            typedef std::shared_ptr<const Dog> ConstPointerType;
            enum Type { HATE_CAT_DOG, HUSKY, GOLDEN_RETRIEVER };
            ...
        };

    fish.h:

        class Fish
        {
            ...
        };

    Any suggestion will be helpful.

    Read the article

  • Constituent Experience Counts In Public Sector

    - by Michael Seback
    Businesses and government organizations are operating in an era of the empowered customer, where service and communication channels are challenged every day. Consumers in the private sector have high expectations, from purchasing gifts online, to reading reviews on social sites, to expecting the companies they do business with to know and reward them. In the public sector, constituents likewise expect government organizations to provide consistent and timely service across agencies and touch points. Examples include requesting critical city services, applying for social assistance, or reviewing insurance plans for a health insurance exchange. If an individual does not receive the services they need at the right time and place, it can create a dire situation involving housing, food, or healthcare assistance. Government organizations need to deliver a fast, reliable, and personalized experience to constituents. Look at a few recent statistics from a government-focused survey:
    - How do you define good customer service? 70% improved services, 48% shortest time to provide information, 44% shortest time to resolve complaints
    - What are ways/opportunities to improve customer service? 69% increased collaboration across agencies, 41% increased customer service channels
    - Are you using the data collected to make informed decisions to improve customer service efforts? 39% data collection is limited, not used to improve decision making
    Source: Re-Imagining Customer Service in Government, 2012. Click here to see the highlights. Would you like to get started? Read Eight Steps to Great Constituent Experiences for Government.

    Read the article

  • Why should I use MSBuild instead of Visual Studio Solution files?

    - by Sid
    We're using TeamCity for continuous integration, and it's building our releases via the solution file (.sln). I've used Makefiles in the past for various systems but never MSBuild (which I've heard is sort of a Makefiles-plus-XML mashup). I've seen many posts on how to use MSBuild directly instead of the solution files, but I don't see a very clear answer on why to do it. So, why should we bother migrating from solution files to an MSBuild 'makefile'? We do have a couple of releases that differ by a #define (featurized builds), but for the most part everything works. The bigger concern is that we'd then have to maintain two systems when adding projects or source code. UPDATE: Can folks shed light on the lifecycle and interplay of the following three components?
    - The Visual Studio .sln file
    - The many project-level .csproj files (which I understand are "sub" MSBuild scripts)
    - The custom MSBuild script
    Is it safe to say that the .sln and .csproj files are consumed/maintained as usual from within the Visual Studio IDE GUI, while the custom MSBuild script is hand-written and usually consumes the already existing individual .csproj files "as-is"? That's one way I can see to reduce overlap/duplication in maintenance... I would appreciate some light on this from other folks' operational experience.

    Read the article

  • Project Jigsaw: Late for the train: The Q&A

    - by Mark Reinhold
    I recently proposed, to the Java community in general and to the SE 8 (JSR 337) Expert Group in particular, to defer Project Jigsaw from Java 8 to Java 9. I also proposed to aim explicitly for a regular two-year release cycle going forward. Herewith a summary of the key questions I’ve seen in reaction to these proposals, along with answers. Making the decision Q Has the Java SE 8 Expert Group decided whether to defer the addition of a module system and the modularization of the Platform to Java SE 9? A No, it has not yet decided. Q By when do you expect the EG to make this decision? A In the next month or so. Q How can I make sure my voice is heard? A The EG will consider all relevant input from the wider community. If you have a prominent blog, column, or other communication channel then there’s a good chance that we’ve already seen your opinion. If not, you’re welcome to send it to the Java SE 8 Comments List, which is the EG’s official feedback channel. Q What’s the overall tone of the feedback you’ve received? A The feedback has been about evenly divided as to whether Java 8 should be delayed for Jigsaw, Jigsaw should be deferred to Java 9, or some other, usually less-realistic, option should be taken. Project Jigsaw Q Why is Project Jigsaw taking so long? A Project Jigsaw started at Sun, way back in August 2008. Like many efforts during the final years of Sun, it was not well staffed. Jigsaw initially ran on a shoestring, with just a handful of mostly part-time engineers, so progress was slow. During the integration of Sun into Oracle all work on Jigsaw was halted for a time, but it was eventually resumed after a thorough consideration of the alternatives. Project Jigsaw was really only fully staffed about a year ago, around the time that Java 7 shipped. We’ve added a few more engineers to the team since then, but that can’t make up for the inadequate initial staffing and the time lost during the transition. Q So it’s really just a matter of staffing limitations and corporate-integration distractions? A Aside from these difficulties, the other main factor in the duration of the project is the sheer technical difficulty of modularizing the JDK. Q Why is modularizing the JDK so hard? A There are two main reasons. The first is that the JDK code base is deeply interconnected at both the API and the implementation levels, having been built over many years primarily in the style of a monolithic software system. We’ve spent considerable effort eliminating or at least simplifying as many API and implementation dependences as possible, so that both the Platform and its implementations can be presented as a coherent set of interdependent modules, but some particularly thorny cases remain. Q What’s the second reason? A We want to maintain as much compatibility with prior releases as possible, most especially for existing classpath-based applications but also, to the extent feasible, for applications composed of modules. Q Is modularizing the JDK even necessary? Can’t you just put it in one big module? A Modularizing the JDK, and more specifically modularizing the Java SE Platform, will enable standard yet flexible Java runtime configurations scaling from large servers down to small embedded devices. In the long term it will enable the convergence of Java SE with the higher-end Java ME Platforms. Q Is Project Jigsaw just about modularizing the JDK? A As originally conceived, Project Jigsaw was indeed focused primarily upon modularizing the JDK. 
The growing demand for a truly standard module system for the Java Platform, which could be used not just for the Platform itself but also for libraries and applications built on top of it, later motivated expanding the scope of the effort. Q As a developer, why should I care about Project Jigsaw? A The introduction of a modular Java Platform will, in the long term, fundamentally change the way that Java implementations, libraries, frameworks, tools, and applications are designed, built, and deployed. Q How much progress has Project Jigsaw made? A We’ve actually made a lot of progress. Much of the core functionality of the module system has been prototyped and works at both compile time and run time. We’ve extended the Java programming language with module declarations, worked out a structure for modular source trees and corresponding compiled-class trees, and implemented these features in javac. We’ve defined an efficient module-file format, extended the JVM to bootstrap a modular JRE, and designed and implemented a preliminary API. We’ve used the module system to make a good first cut at dividing the JDK and the Java SE API into a coherent set of modules. Among other things, we’re currently working to retrofit the java.util.ServiceLoader API to support modular services. Q I want to help! How can I get involved? A Check out the project page, read the draft requirements and design overview documents, download the latest prototype build, and play with it. You can tell us what you think, and follow the rest of our work in real time, on the jigsaw-dev list. The Java Platform Module System JSR Q What’s the relationship between Project Jigsaw and the eventual Java Platform Module System JSR? A At a high level, Project Jigsaw has two phases. In the first phase we’re exploring an approach to modularity that’s markedly different from that of existing Java modularity solutions. We’ve assumed that we can change the Java programming language, the virtual machine, and the APIs. Doing so enables a design which can strongly enforce module boundaries in all program phases, from compilation to deployment to execution. That, in turn, leads to better usability, diagnosability, security, and performance. The ultimate goal of the first phase is to produce a working prototype which can inform the work of the Module-System JSR EG. Q What will happen in the second phase of Project Jigsaw? A The second phase will produce the reference implementation of the specification created by the Module-System JSR EG. The EG might ultimately choose an entirely different approach than the one we’re exploring now. If and when that happens then Project Jigsaw will change course as necessary, but either way I think that the end result will be better for having been informed by our current work. Maven & OSGi Q Why not just use Maven? A Maven is a software project management and comprehension tool. As such it can be seen as a kind of build-time module system but, by its nature, it does nothing to support modularity at run time. Q Why not just adopt OSGi? A OSGi is a rich dynamic component system which includes not just a module system but also a life-cycle model and a dynamic service registry. The latter two facilities are useful to some kinds of sophisticated applications, but I don’t think they’re of wide enough interest to be standardized as part of the Java SE Platform. Q Okay, then why not just adopt the module layer of OSGi? 
A The OSGi module layer is not operative at compile time; it only addresses modularity during packaging, deployment, and execution. As it stands, moreover, it’s useful for library and application modules but, since it’s built strictly on top of the Java SE Platform, it can’t be used to modularize the Platform itself. Q If Maven addresses modularity at build time, and the OSGi module layer addresses modularity during deployment and at run time, then why not just use the two together, as many developers already do? A The combination of Maven and OSGi is certainly very useful in practice today. These systems have, however, been built on top of the existing Java platform; they have not been able to change the platform itself. This means, among other things, that module boundaries are weakly enforced, if at all, which makes it difficult to diagnose configuration errors and impossible to run untrusted code securely. The prototype Jigsaw module system, by contrast, aims to define a platform-level solution which extends both the language and the JVM in order to enforce module boundaries strongly and uniformly in all program phases. Q If the EG chooses an approach like the one currently being taken in the Jigsaw prototype, will Maven and OSGi be made obsolete? A No, not at all! No matter what approach is taken, to ensure wide adoption it’s essential that the standard Java Platform Module System interact well with Maven. Applications that depend upon the sophisticated features of OSGi will no doubt continue to use OSGi, so it’s critical that implementations of OSGi be able to run on top of the Java module system and, if suitably modified, support OSGi bundles that depend upon Java modules. Ideas for how to do that are currently being explored in Project Penrose. Java 8 & Java 9 Q Without Jigsaw, won’t Java 8 be a pretty boring release? A No, far from it! It’s still slated to include the widely-anticipated Project Lambda (JSR 335), work on which has been going very well, along with the new Date/Time API (JSR 310), Type Annotations (JSR 308), and a set of smaller features already in progress. Q Won’t deferring Jigsaw to Java 9 delay the eventual convergence of the higher-end Java ME Platforms with Java SE? A It will slow that transition, but it will not stop it. To allow progress toward that convergence to be made with Java 8 I’ve suggested to the Java SE 8 EG that we consider specifying a small number of Profiles which would allow compact configurations of the SE Platform to be built and deployed. Q If Jigsaw is deferred to Java 9, would the Oracle engineers currently working on it be reassigned to other Java 8 features and then return to working on Jigsaw again after Java 8 ships? A No, these engineers would continue to work primarily on Jigsaw from now until Java 9 ships. Q Why not drop Lambda and finish Jigsaw instead? A Even if the engineers currently working on Lambda could instantly switch over to Jigsaw and immediately become productive—which of course they can’t—there are less than nine months remaining in the Java 8 schedule for work on major features. That’s just not enough time for the broad review, testing, and feedback which such a fundamental change to the Java Platform requires. Q Why not ship the module system in Java 8, and then modularize the platform in Java 9? A If we deliver a module system in one release but don’t use it to modularize the JDK until some later release then we run a big risk of getting something fundamentally wrong. 
If that happens then we’d have to fix it in the later release, and fixing fundamental design flaws after the fact almost always leads to a poor end result. Q Why not ship Jigsaw in an 8.5 release, less than two years after 8? Or why not just ship a new release every year, rather than every other year? A Many more developers work on the JDK today than a couple of years ago, both because Oracle has dramatically increased its own investment and because other organizations and individuals have joined the OpenJDK Community. Collectively we don’t, however, have the bandwidth required to ship and then provide long-term support for a big JDK release more frequently than about every other year. Q What’s the feedback been on the two-year release-cycle proposal? A For just about every comment that we should release more frequently, so that new features are available sooner, there’s been another asking for an even slower release cycle so that large teams of enterprise developers who ship mission-critical applications have a chance to migrate at a comfortable pace.

    Read the article

  • Ubuntu sound volume is 40% lower than in Windows

    - by ncomx
    I have 2x 2.1 speakers connected to the computer, which has Ubuntu 12.04 installed. On the software side I've set all the volume controls to 100% with the alsamixer program. The speakers have their own volume control; keeping that at the same level and switching between Ubuntu and Windows (XP and 7), the output volume on Windows is at least 40% higher - even with the Windows volume control at 50% (and without touching the speakers' volume control) it's still much louder than the sound on Ubuntu. Why could this be happening? Are there alternative sound drivers (other than the default ones) I could test to see if they make a difference? Some info about the card:

        root:$ cat /proc/asound/cards
         0 [PCH            ]: HDA-Intel - HDA Intel PCH
                              HDA Intel PCH at 0xfbff4000 irq 55
         1 [Generic        ]: HDA-Intel - HD-Audio Generic
                              HD-Audio Generic at 0xfbcfc000 irq 56

        root:$ lspci | grep -i audio
        00:1b.0 Audio device: Intel Corporation 6 Series/C200 Series Chipset Family High Definition Audio Controller (rev 04)
        02:00.1 Audio device: Advanced Micro Devices [AMD] nee ATI Cayman/Antilles HDMI Audio [Radeon HD 6900 Series]

    I think the one I am using is the Intel one; the other comes from the VGA card, which is an ATI Radeon 6950. Running gstreamer-properties and switching between alsa, oss, ossv4 and pulseaudio doesn't seem to make any difference.

    Read the article

  • How do I mount a CIFS share via FSTAB and give full RW to Guest

    - by Kendor
    I want to create a Public folder that has full RW access. The problem with my configuration is that while Windows users have no issues as guests (they can read, write, and delete), my Ubuntu client can't do the same: we can only read and write, but not create or delete. Here is the smb.conf from my server:

        [global]
            workgroup = WORKGROUP
            netbios name = FILESERVER
            server string = TurnKey FileServer
            os level = 20
            security = user
            map to guest = Bad Password
            passdb backend = tdbsam
            null passwords = yes
            admin users = root
            encrypt passwords = true
            obey pam restrictions = yes
            pam password change = yes
            unix password sync = yes
            passwd program = /usr/bin/passwd %u
            passwd chat = *Enter\snew\s*\spassword:* %n\n *Retype\snew\s*\spassword:* %n\n *password\supdated\ssuccessfully* .
            add user script = /usr/sbin/useradd -m '%u' -g users -G users
            delete user script = /usr/sbin/userdel -r '%u'
            add group script = /usr/sbin/groupadd '%g'
            delete group script = /usr/sbin/groupdel '%g'
            add user to group script = /usr/sbin/usermod -G '%g' '%u'
            guest account = nobody
            syslog = 0
            log file = /var/log/samba/samba.log
            max log size = 1000
            wins support = yes
            dns proxy = no
            socket options = TCP_NODELAY
            panic action = /usr/share/samba/panic-action %d

        [homes]
            comment = Home Directory
            browseable = no
            read only = no
            valid users = %S

        [storage]
            create mask = 0777
            directory mask = 0777
            browseable = yes
            comment = Public Share
            writeable = yes
            public = yes
            path = /srv/storage

    The following fstab entry doesn't yield full R/W access to the share:

        //192.168.0.5/storage /media/myname/TK-Public/ cifs rw 0 0

    This doesn't work either:

        //192.168.0.5/storage /media/myname/TK-Public/ cifs rw,guest,iocharset=utf8,file_mode=0777,dir_mode=0777,noperm 0 0

    Using the following location in Nemo/Nautilus without the share being mounted does work: smb://192.168.0.5/storage/
    Extra info: I just noticed that if I copy a file to the share after mounting, my Ubuntu client immediately makes "nobody" the owner, gives the group "no group" read and write, and leaves everyone else read-only. What am I doing wrong?

    Read the article
